8 results for Embedded Constructions

at Universidad de Alicante


Relevance: 20.00%

Abstract:

Hardware/software partitioning (HSP) is a key task in embedded system co-design. Its goal is to decide which components of an application will execute on a general-purpose processor (software) and which on dedicated hardware, subject to a set of constraints expressed as metrics. In recent years, several metaheuristic-driven approaches have been proposed to solve the HSP problem; however, owing to the diversity of models and metrics used, choosing the best-suited algorithm remains an open problem. This article presents the results of applying a fuzzy approach to the HSP problem. The approach is more flexible than many others because it can accept reasonably good solutions and reject those that do not look promising. We compare six metaheuristic algorithms: Random Search, Tabu Search, Simulated Annealing, Hill Climbing, Genetic Algorithm, and Evolutionary Strategy. The model aims to minimize hardware area and execution time simultaneously. The results show that Restart Hill Climbing performs best in most cases.
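As an illustration of the kind of search these metaheuristics perform, the sketch below runs a restart hill climber on a toy HSP instance. The component costs, weights, and cost function are invented for the example, not taken from the paper:

```python
import random

# Hypothetical per-component costs: (hw_area, hw_time, sw_time).
# A partition is a bit vector: 1 = map to hardware, 0 = to software.
COMPONENTS = [(4, 1, 6), (2, 2, 5), (6, 1, 9), (3, 3, 4), (5, 2, 8)]

def cost(partition, w_area=0.5, w_time=0.5):
    """Weighted sum of total hardware area and total execution time."""
    area = sum(c[0] for c, bit in zip(COMPONENTS, partition) if bit)
    time = sum(c[1] if bit else c[2] for c, bit in zip(COMPONENTS, partition))
    return w_area * area + w_time * time

def hill_climb(start, steps=200):
    """Flip one component's mapping at a time, keeping only improvements."""
    best = start
    for _ in range(steps):
        i = random.randrange(len(best))
        cand = best[:i] + [1 - best[i]] + best[i + 1:]
        if cost(cand) < cost(best):
            best = cand
    return best

def restart_hill_climb(restarts=10, seed=0):
    """Restart from several random partitions and keep the overall best."""
    random.seed(seed)
    sols = [hill_climb([random.randint(0, 1) for _ in COMPONENTS])
            for _ in range(restarts)]
    return min(sols, key=cost)

best = restart_hill_climb()
```

Restarting is what rescues plain hill climbing from local optima; on this separable toy instance every restart converges to the same optimum, but on real HSP instances with interacting components the restarts matter.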

Relevance: 20.00%

Abstract:

Commercial off-the-shelf microprocessors are the core of low-cost embedded systems thanks to their programmability and cost-effectiveness. Recent advances in electronic technologies have brought remarkable improvements in their performance, but they have also made microprocessors more susceptible to transient faults induced by radiation. These non-destructive events (soft errors) may cause a microprocessor to compute a wrong result or to lose control of a system, with catastrophic consequences. Soft-error mitigation has therefore become a compulsory requirement for a growing number of applications, operating from space down to ground level. In this context, this paper uses the concept of selective hardening, which aims at designing flexible mitigation techniques with reduced overhead. Following this concept, a novel flexible version of SWIFT-R, a software-based fault-recovery technique, is proposed. Our approach makes it possible to protect in software different subsets of the microprocessor's register file. The design space is thus enriched with a wide spectrum of new, partially protected versions, which offer designers more flexibility in finding the best trade-offs between performance, code size, and fault coverage. Three case studies illustrate the applicability and flexibility of the proposal.
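A minimal sketch of the selection problem behind this approach, using invented per-register numbers: given an estimated coverage gain and overhead for protecting each register, pick the subset that maximizes coverage within an overhead budget. The paper's actual SWIFT-R transformation works at the compiler level; this only illustrates the subset trade-off:

```python
from itertools import combinations

# Hypothetical per-register profile: protecting a register in software
# (SWIFT-R-style triplication) costs overhead but raises fault coverage.
# Values are (coverage gain, overhead fraction), both invented.
REGS = {"r0": (0.30, 0.12), "r1": (0.25, 0.10),
        "r2": (0.15, 0.08), "r3": (0.05, 0.07)}

def evaluate(subset):
    """Total coverage gain and total overhead of protecting a subset."""
    cov = sum(REGS[r][0] for r in subset)
    ovh = sum(REGS[r][1] for r in subset)
    return cov, ovh

def best_subset(max_overhead):
    """Exhaustively pick the register subset with the highest coverage
    whose combined overhead stays within the budget (fine for a small
    register file; larger files would need a heuristic search)."""
    best, best_cov = (), 0.0
    for k in range(len(REGS) + 1):
        for sub in combinations(REGS, k):
            cov, ovh = evaluate(sub)
            if ovh <= max_overhead and cov > best_cov:
                best, best_cov = sub, cov
    return best, best_cov

subset, coverage = best_subset(max_overhead=0.25)
```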

Relevance: 20.00%

Abstract:

The design of fault-tolerant systems is gaining importance in large domains of embedded applications where design constraints are as important as reliability. New software techniques, based on the selective application of redundancy, have shown remarkable fault coverage with reduced costs and overheads. However, the large number of different solutions these techniques provide, and the costly process of assessing their reliability, make design space exploration a very difficult and time-consuming task. This paper proposes integrating a multi-objective optimization tool with a software-hardening environment to perform automatic design space exploration in search of the best trade-offs between reliability, cost, and performance. The first tool is driven by a genetic algorithm that can pursue many design goals simultaneously thanks to the NSGA-II multi-objective algorithm. The second is a compiler-based infrastructure that automatically produces selectively protected (hardened) versions of the software and generates accurate overhead reports and fault-coverage estimations. The advantages of our proposal are illustrated by a complex and detailed case study involving a typical embedded application: the AES (Advanced Encryption Standard).
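The core of a multi-objective exploration like this is retaining only the non-dominated (Pareto-optimal) hardened versions. A minimal sketch with invented objective values (code-size overhead, execution-time overhead, and residual fault rate, all to be minimized; the full NSGA-II also adds ranking and crowding-distance selection, omitted here):

```python
def dominates(a, b):
    """a dominates b if it is no worse in every objective and strictly
    better in at least one (all objectives are minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(candidates):
    """Keep only candidates not dominated by any other candidate."""
    return [c for c in candidates
            if not any(dominates(o, c) for o in candidates if o is not c)]

# Hypothetical hardened versions of one application, as tuples of
# (code-size overhead, execution-time overhead, residual fault rate):
versions = [(1.0, 1.0, 0.40), (1.3, 1.2, 0.15),
            (1.6, 1.5, 0.05), (1.7, 1.6, 0.20)]
front = pareto_front(versions)
```

The last version is dominated (it costs more than the second while covering less), so it drops out of the front; the remaining three are the trade-offs a designer would actually choose among.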

Relevance: 20.00%

Abstract:

The development of applications and services for mobile systems faces a wide range of devices with very heterogeneous capabilities and hard-to-predict response times. The research described in this work addresses this issue by developing a computational model that formalizes the problem and defines adaptive computing methods. The proposal combines imprecise-computation strategies with cloud computing paradigms to provide flexible implementation frameworks for embedded or mobile devices. As a result, an imprecise-computation scheduling method applied to the embedded system's workload decides when to move computation to the cloud, according to the priority and response time of the tasks to be executed, so that the desired productivity and quality of service can be met. The paper presents a technique to estimate network delays and schedule tasks more accurately, and describes an application example in which the technique is tested in execution contexts with heterogeneous workloads to check the validity of the proposed model.
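A toy sketch of the offloading decision described above, with invented task parameters (local execution time, cloud execution time, deadline) and a single estimated network delay. The actual model in the paper is richer; this only shows the local-vs-cloud comparison and the imprecise-computation idea of skipping optional work that cannot meet its deadline:

```python
def estimated_response(local_time, cloud_time, net_delay):
    """Response time if run locally vs offloaded (cloud compute + round trip)."""
    return local_time, cloud_time + net_delay

def schedule(tasks, net_delay):
    """Offload a task when the estimated cloud response beats local execution.
    In the imprecise-computation spirit, an optional task whose best response
    time still misses its deadline is skipped rather than run late."""
    plan = {}
    for name, (local_t, cloud_t, deadline) in tasks.items():
        local, cloud = estimated_response(local_t, cloud_t, net_delay)
        where = "cloud" if cloud < local else "local"
        plan[name] = where if min(local, cloud) <= deadline else "skip-optional"
    return plan

# Invented tasks: name -> (local time, cloud time, deadline), in ms.
plan = schedule({"filter": (50, 10, 40),
                 "log":    (5,  2, 100),
                 "render": (80, 20, 30)}, net_delay=15)
```

With a 15 ms network delay, heavy tasks win by offloading, trivial tasks stay local (the round trip would dominate), and the task that cannot meet its deadline either way is dropped as optional work.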

Relevance: 20.00%

Abstract:

Paper presented at the V Jornadas de Computación Empotrada, Valladolid, 17-19 September 2014.

Relevance: 20.00%

Abstract:

Given the importance of the rehabilitation and recovery of architectural heritage in people's lives, this paper aims to reinforce traditional methods of stone-vault calculation with the capabilities of the ANSYS Workbench program, which can, for example, reveal pathologies that may have arisen over the construction history of a building. To bound the research, the upper vault of the main chapel of the Santiago parish church in Orihuela (Alicante) is taken as reference. It is an important sixteenth-century Renaissance work by Jerónimo Quijano: an innovative stone masonry vault consisting of eight double arches intercrossed with one another and braced by severies. During the seventeenth century there was a lantern in the central cap, and it is unknown why it was removed. Its construction could explain the original solution of intercrossed arches, which freed the center to create a brighter and more comfortable presbytery. By analogy with other works by Quijano, a small lantern piercing the central spherical cap is considered. A comparative study is proposed against different architectural solutions of the same period, based on several common parameters: a square-plan vault with a spherical surround, intercrossed arches, a possible lantern, the dimensions of the enclosed space, similar load states, and compact limestone masonry. The three solutions differ mainly in their size and type of lantern, and their comparison reveals which is the most resistant and stable. The other two works maintain some connection with Quijano's professional circle.
The Communion chapel of the Basilica in Elche (a large prismatic lantern with a large cylindrical drum springing from the arches themselves and an upper hemispherical dome) was selected for its state of conservation, its proximity to Orihuela, and its construction during the eighteenth century. Finally, a significant Spanish Renaissance dome completes the selection: the cross vault of the Benavides Chapel of the San Francisco Convent in Baeza (Jaén), designed by Andrés de Vandelvira in the sixteenth century (a large hemispherical dome springing from the arches themselves). To simplify the calculation and make the cases comparable, all three were modeled with similar characteristics: a constant thickness of 30 cm, specifically analyzed intercrossed arches, and identical loads, Young's modulus, and Poisson's ratio. In the results, compressive stresses predominate in general, helped by the joint collaboration of the filling material over the vault, the vault itself, the thick side walls, the buttresses, and the weight of the top cover. All three solutions are structurally adequate, the Orihuela vault being the safest and the Baeza one the riskiest because of its large dimensions. The scheme of intercrossed arches of suitable thickness would therefore have allowed the heaviest lantern to be built, confirming it as a Renaissance architectural typology in stone.
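As background to such finite-element checks, the classical membrane-theory estimate for a spherical cap can be written down explicitly (a standard shell-theory result, not a formula taken from the paper): for a hemispherical shell of radius $R$ and thickness $t$ under self-weight $q$ per unit surface area,

```latex
N_\varphi = -\frac{qR}{1+\cos\varphi}, \qquad
N_\theta  = qR\left(\frac{1}{1+\cos\varphi}-\cos\varphi\right), \qquad
\sigma = \frac{N}{t}
```

The meridional force $N_\varphi$ is compressive everywhere, while the hoop force $N_\theta$ is compressive near the crown and only turns tensile below about $\varphi \approx 52^\circ$, which is consistent with the predominance of compressive stresses reported for these shallow caps with thick supporting walls.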

Relevance: 20.00%

Abstract:

Information technologies (IT) currently account for 2% of CO2 emissions. In recent years, a wide variety of IT solutions have been proposed to increase the energy efficiency of network data centers. Monitoring is one of the fundamental pillars of these systems, providing the information necessary for adequate decision-making. However, today's monitoring systems (MSs) are partial, specific, and highly coupled solutions. This study proposes a model for monitoring data centers that serves as a basis for energy-saving systems, offered as a value-added service embedded in a low-cost, low-power device. The proposal is general, comprehensive, scalable, and focused on heterogeneous environments, and it adapts quickly to the needs of changing, dynamic environments. Furthermore, a prototype of the system has been implemented on several devices, which has allowed validation of the proposal and identification of the minimum hardware profile required to support the model.
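A minimal sketch of a pluggable monitoring agent in the spirit described above. The class names, metrics, and record format are invented for illustration; the paper's actual model is not specified here. The point is that sensors register themselves and each sample is self-describing, so new metrics can be added without changing the collector:

```python
import json
import time

class Sensor:
    """One metric source; subclasses or read callbacks adapt heterogeneous hardware."""
    def __init__(self, name, read_fn, unit):
        self.name, self.read, self.unit = name, read_fn, unit

class MonitoringAgent:
    """Minimal embedded-style agent: each sampling cycle emits one
    self-describing JSON record listing every registered metric."""
    def __init__(self):
        self.sensors = []

    def register(self, sensor):
        self.sensors.append(sensor)

    def sample(self):
        return json.dumps({
            "ts": int(time.time()),
            "metrics": {s.name: {"value": s.read(), "unit": s.unit}
                        for s in self.sensors},
        })

agent = MonitoringAgent()
agent.register(Sensor("cpu_load", lambda: 0.42, "ratio"))  # stubbed readings
agent.register(Sensor("power", lambda: 3.1, "W"))
record = json.loads(agent.sample())
```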

Relevance: 20.00%

Abstract:

There is increasing concern to reduce cost and overheads in the development of reliable systems. Selective protection of the most critical parts of a system represents a viable way to obtain a high level of reliability at a fraction of the cost. In particular, to design a selective fault-mitigation strategy for processor-based systems, it is mandatory to identify and prioritize the most vulnerable registers in the register file as the best candidates for protection (hardening). This paper presents an application-based metric to estimate the criticality of each register in the microprocessor register file. The proposed metric combines three criteria based on common features of the executed applications. The applicability and accuracy of our proposal have been evaluated on a set of applications running on different microprocessors. Results show a significant improvement in accuracy over previous approaches, regardless of the underlying architecture.
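A sketch of how an application-based metric might combine several profiling criteria into a per-register criticality ranking. The three criteria and the weights below are illustrative assumptions, not the paper's actual metric:

```python
def criticality(profile, w=(0.4, 0.4, 0.2)):
    """Rank registers by a weighted combination of three application-level
    criteria (each normalized to 0..1): fraction of time the register is
    live, read frequency, and fan-out of values derived from it.
    Both the criteria and the weights are illustrative, not the paper's."""
    scores = {reg: w[0] * live + w[1] * reads + w[2] * fanout
              for reg, (live, reads, fanout) in profile.items()}
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical profiling data: register -> (liveness, reads, fan-out).
profile = {
    "r1": (0.9, 0.8, 0.7),
    "r2": (0.2, 0.9, 0.1),
    "r3": (0.6, 0.3, 0.9),
}
ranking = criticality(profile)
```

The registers at the head of the ranking would then be the first candidates for selective hardening, under whatever overhead budget the designer allows.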