11 results for Software Design Pattern

at Universidad de Alicante


Relevance: 30.00%

Abstract:

Hardware/Software partitioning (HSP) is a key task in embedded system co-design. Its main goal is to decide which components of an application are to be executed on a general purpose processor (software) and which on specific hardware, taking into account a set of constraints expressed as metrics. In recent years, several approaches driven by metaheuristic algorithms have been proposed for solving the HSP problem. However, due to the diversity of models and metrics used, the choice of the best-suited algorithm is still an open problem. This article presents the results of applying a fuzzy approach to the HSP problem. This approach is more flexible than many others because it can accept reasonably good solutions and reject those that do not seem good enough. In this work we compare six metaheuristic algorithms: Random Search, Tabu Search, Simulated Annealing, Hill Climbing, Genetic Algorithm, and Evolutionary Strategy. The presented model aims to simultaneously minimize the hardware area and the execution time. The obtained results show that Restart Hill Climbing is the best-performing algorithm in most cases.
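As a rough illustration of the kind of search the abstract describes, the sketch below applies restart hill climbing to a toy HSP instance. The cost model (a weighted sum of hardware area and execution time) and all figures are hypothetical, not taken from the article:

```python
import random

def partition_cost(assignment, hw_area, sw_time, hw_time, w=0.5):
    """Weighted cost of a HW/SW partition (hypothetical model):
    assignment[i] == 1 -> component i in hardware, 0 -> software."""
    area = sum(a for a, h in zip(hw_area, assignment) if h)
    time = sum(t_hw if h else t_sw
               for t_hw, t_sw, h in zip(hw_time, sw_time, assignment))
    return w * area + (1 - w) * time

def restart_hill_climbing(n, cost, restarts=20, steps=200, seed=0):
    """Restart hill climbing: repeated greedy bit-flip search from
    random initial partitions, keeping the best solution found."""
    rng = random.Random(seed)
    best, best_c = None, float("inf")
    for _ in range(restarts):
        x = [rng.randint(0, 1) for _ in range(n)]
        c = cost(x)
        for _ in range(steps):
            i = rng.randrange(n)   # flip one component's mapping
            x[i] ^= 1
            c2 = cost(x)
            if c2 <= c:
                c = c2             # keep improving (or equal) moves
            else:
                x[i] ^= 1          # revert worsening moves
        if c < best_c:
            best, best_c = x[:], c
    return best, best_c
```

Because each restart only needs cost evaluations, the same loop works with any partitioning metric; the restarts are what let the search escape poor local optima.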

Relevance: 30.00%

Abstract:

Hardware/software partitioning is a fundamental task in embedded system co-design. It decides, taking the design metrics into account, which components will run on a general purpose processor (software) and which on specific hardware. In recent years, several solutions to the partitioning problem, driven by metaheuristic algorithms, have been proposed. However, due to the diversity of models and metrics used, choosing the most appropriate algorithm remains an open problem. This work presents a comparison of six metaheuristic algorithms: Random Search, Tabu Search, Simulated Annealing, Stochastic Hill Climbing, Genetic Algorithm, and Evolution Strategy. The model used in the comparison aims to minimize the occupied area and the execution time; the model's constraints are treated as penalties so that the search space also includes other solutions. The results show that Stochastic Hill Climbing and Evolution Strategy obtain the best results overall, followed by the Genetic Algorithm.

Relevance: 30.00%

Abstract:

The footwear industry is a traditional craft sector where technological advances are difficult to implement owing to the complexity of the processes involved and the level of precision most of them demand. The shoe last joining operation is one clear example: two halves from different lasts are put together, following a specifically traditional process, to create a new one. The existing surface joining techniques analysed in this paper are not well adapted to shoe last design and production processes, which makes their adoption by the industry difficult. This paper presents an alternative surface joining technique inspired by the traditional work of lastmakers, so that lastmakers can easily adapt to the new tool and make the most of their know-how. The technique is based on curve networks created on the surfaces to be joined, instead of on discrete data. Finally, a series of joining tests is presented in which real lasts were successfully joined using commercial last design software. The method proved valid, efficient, and feasible within the sector.

Relevance: 30.00%

Abstract:

Commercial off-the-shelf microprocessors are the core of low-cost embedded systems due to their programmability and cost-effectiveness. Recent advances in electronic technologies have allowed remarkable improvements in their performance. However, they have also made microprocessors more susceptible to transient faults induced by radiation. These non-destructive events (soft errors) may cause a microprocessor to produce a wrong computation result or lose control of a system, with catastrophic consequences. Soft error mitigation has therefore become a compulsory requirement for an increasing number of applications, operating from space down to ground level. In this context, this paper uses the concept of selective hardening, which aims at designing reduced-overhead, flexible mitigation techniques. Following this concept, a novel flexible version of the software-based fault recovery technique known as SWIFT-R is proposed. Our approach makes it possible to select different register subsets from the microprocessor register file to be protected in software. The design space is thus enriched with a wide spectrum of new partially protected versions, which offer more flexibility to designers and allow finding the best trade-offs between performance, code size, and fault coverage. Three case studies are developed to show the applicability and flexibility of the proposal.
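The "wide spectrum of partially protected versions" can be pictured as one hardened version per register subset. The sketch below enumerates them for a tiny hypothetical register file; the register names, the per-register overhead and coverage figures, and the additive cost model are all illustrative assumptions, not data from the paper:

```python
from itertools import combinations

# Hypothetical per-register figures: protecting a register in software
# (as selective SWIFT-R does) adds overhead and buys fault coverage.
REGS = {"r1": (0.12, 0.30), "r2": (0.08, 0.25),
        "r3": (0.15, 0.28), "r4": (0.05, 0.10)}  # (overhead, coverage)

def enumerate_hardened_versions(regs):
    """Enumerate every partially protected version (one per register
    subset), reporting its total overhead and coverage under a
    simplified additive model."""
    names = list(regs)
    versions = []
    for k in range(len(names) + 1):
        for subset in combinations(names, k):
            overhead = sum(regs[r][0] for r in subset)
            coverage = sum(regs[r][1] for r in subset)
            versions.append((subset, overhead, coverage))
    return versions
```

With n registers this yields 2^n candidate versions, which is exactly why the paper frames selective hardening as a design space exploration problem.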

Relevance: 30.00%

Abstract:

The design of fault tolerant systems is gaining importance in large domains of embedded applications where design constraints are as important as reliability. New software techniques based on the selective application of redundancy have shown remarkable fault coverage with reduced costs and overheads. However, the large number of different solutions these techniques provide, and the costly process of assessing their reliability, make design space exploration a very difficult and time-consuming task. This paper proposes the integration of a multi-objective optimization tool with a software hardening environment to perform an automatic design space exploration in search of the best trade-offs between reliability, cost, and performance. The first tool is driven by the NSGA-II multi-objective genetic algorithm, which can pursue many design goals simultaneously. The second is a compiler-based infrastructure that automatically produces selectively protected (hardened) versions of the software and generates accurate overhead reports and fault coverage estimations. The advantages of our proposal are illustrated by means of a complex and detailed case study involving a typical embedded application, the AES (Advanced Encryption Standard).
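The core operation in any NSGA-II-style exploration is non-dominated filtering: keeping only the hardened versions that no other version beats on every objective. A minimal sketch, with hypothetical objective vectors (all objectives minimized, e.g. overhead and uncovered faults):

```python
def dominates(a, b):
    """a dominates b if a is no worse in every objective and strictly
    better in at least one (all objectives minimized)."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def pareto_front(points):
    """Keep only non-dominated points: the trade-off front between
    objectives such as overhead and residual fault vulnerability."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]
```

NSGA-II layers crowding distance and evolutionary operators on top of this filter; the filter alone already explains why the tool returns a set of trade-offs rather than a single "best" hardened version.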

Relevance: 30.00%

Abstract:

Hardware/software partitioning is a key stage of the embedded system co-design process. In this stage it is decided which components will be implemented as hardware co-processors and which will be implemented on a general purpose processor. The decision is made by exploring the design space, evaluating a set of candidate solutions to establish which of them achieves the best balance among all the design metrics. To explore the solution space, most proposals use metaheuristic algorithms, most notably Genetic Algorithms and Simulated Annealing. In many cases, this choice is not based on comparative analyses involving several algorithms on the same problem. This work presents the application of the Stochastic Hill Climbing and Stochastic Hill Climbing with Restart algorithms to the hardware/software partitioning problem. To validate their use, the algorithms are applied to a case study, namely the hardware/software partitioning of a JPEG encoder. In all the experiments, both algorithms reach solutions comparable to those obtained by the most frequently used algorithms.

Relevance: 30.00%

Abstract:

This case report describes the visual rehabilitation obtained by fitting a new design of full scleral contact lens (ICD 16.5, Paragon Vision Sciences, distributed by Lenticon, Madrid, Spain) in a cornea with advanced keratoconus and previous implantation of an intracorneal ring segment with very limited effect. The eye had a refraction of –3.00 × 55° cylinder, providing a visual acuity of 0.5 logMAR. The topographic pattern was very irregular, with significant central protrusion and significant central corneal thinning. Several previous fittings with corneal and corneo-scleral lenses had been unsuccessful. Comfortable wear was achieved with a full scleral contact lens of 4600 μm sagittal height and –11.25 D optical power, providing an apical clearance of 196 μm. A visual acuity of 0.0 logMAR, combined with a relevant aberrometric improvement, was achieved with this contact lens. The patient was completely satisfied with the fitting, and the result was maintained for 1 year afterwards. Full scleral lenses are thus able to provide comfortable wear and a significant increase in visual acuity, combined with a significant improvement in visual quality, in eyes with advanced keratoconus.

Relevance: 30.00%

Abstract:

The general purpose of the EQUIFASE Conference is to promote scientific and technological exchange between people from both academia and industry in the field of phase equilibria and thermodynamic properties for the design of chemical processes. Topics: Measurement of Thermodynamic Properties. Phase Equilibria and Chemical Equilibria. Theory and Modelling. Alternative Solvents. Supercritical Fluids. Ionic Liquids. Energy. Gas and Oil. Petrochemicals. Environment and Sustainability. Biomolecules and Biotechnology. Product and Process Design. Databases and Software. Education.

Relevance: 30.00%

Abstract:

Objective: To assess the usefulness of microperimetry (MP) as an additional objective method for characterizing the fixation pattern in nystagmus. Design: Prospective study. Participants: Fifteen eyes of 8 subjects (age, 12–80 years) with nystagmus from the Lluís Alcanyís Foundation (University of Valencia, Spain) were included. Methods: All patients had a comprehensive ophthalmologic examination, including a microperimetric examination (MAIA, CenterVue, Padova, Italy). The following microperimetric parameters were evaluated: average threshold (AT), macular integrity index (MI), fixating points within circles of 1° (P1) and 2° (P2) of radius, bivariate contour ellipse area (BCEA) considering 63% and 95% of fixating points, and the horizontal and vertical axes of that ellipse. Results: In monocular conditions, 6 eyes showed fixation classified as stable, 6 eyes relatively unstable fixation, and 3 eyes unstable fixation. Statistically significant differences were found between the horizontal and vertical components of movement (p = 0.001), as well as in their ranges (p < 0.001). Intereye comparison showed differences between eyes in some subjects, but statistically significant differences were found only in the fixation coordinates X and Y (p < 0.001); no significant intereye differences were found in the microperimetric parameters. Between monocular and binocular conditions, statistically significant differences in the X and Y coordinates were found in all eyes (p < 0.02) except one; no significant differences were found in the MP parameters between monocular and binocular conditions. Strong correlations of corrected distance visual acuity (CDVA) with AT (r = 0.812, p = 0.014), MI (r = –0.812, p = 0.014), P1 (r = 0.729, p = 0.002), horizontal diameter of BCEA (r = –0.700, p = 0.004), and X range (r = –0.722, p = 0.005) were found.
Conclusions: MP seems to be a useful technology for characterizing the fixation pattern in nystagmus, which in turn appears to be related to the level of visual acuity achieved by the patient.

Relevance: 30.00%

Abstract:

Today, the professional skills demanded of university students are constantly increasing in our society. In our opinion, the content offered in official degrees needs to be complemented in parallel from different angles that enrich students' overall professional knowledge, which is why a wide variety of complementary courses has appeared at universities in recent years. One of the most socially demanded technical requirements in the architectural, design, and engineering fields is the command of 3D drawing software, which has become indispensable in these sectors. This specific training thus becomes essential beyond traditional two-dimensional design, because it offers spatial-development possibilities that go beyond conventional orthographic projections (plans, sections, or elevations), allowing the selected items to be modelled and rotated from multiple angles and perspectives. This paper therefore analyzes the teaching methodology of a complementary course for construction-industry technicians interested in computer-aided design, using a modelling program (SketchupMake) and a rendering program (Kerkythea). The course is developed from the technician's point of view, learning computer management and its application to professional practice, moving from a general to a specific view through practical examples. The proposed methodology is based on the development of real examples in different professional settings, such as rehabilitation, new construction, opening projects, or architectural design. This multidisciplinary contribution sharpens students' critical judgement in different areas, encouraging new learning strategies and the independent development of three-dimensional solutions.
The practical implementation of new situations, even ones suggested by the students themselves, ensures active participation, saves time during the design process, and increases effectiveness when generating elements that can be represented, moved, or virtually tested. In conclusion, this teaching-learning methodology improves students' skills and competencies to face society's growing professional demands. After finishing the course, technicians not only improved their expertise in drawing but also enhanced their capacity for spatial vision, both essential qualities in these sectors that can be applied to their professional development with great success.

Relevance: 30.00%

Abstract:

Object tracking with subpixel accuracy is of fundamental importance in many fields, since it provides optimal performance at relatively low cost. Although there are many theoretical proposals that lead to resolution increments of several orders of magnitude, in practice this resolution is limited by the imaging system. In this paper we propose and demonstrate through numerical models a realistic limit for subpixel accuracy. The final result is that the maximum achievable resolution enhancement is connected with the dynamic range of the image, i.e. the detection limit is 1/2^(nr.bits). The results presented here may help in the proper design of superresolution experiments in microscopy, surveillance, defense, and other fields.
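The stated limit of 1/2^(nr.bits) follows directly from quantization: a shift too small to change any grey level is invisible to the tracker. A minimal sketch of that bound as a function of bit depth (the example bit depths are illustrative):

```python
def subpixel_detection_limit(n_bits):
    """Detection limit (in pixels) set by the image dynamic range:
    per the abstract's result, a displacement smaller than
    1 / 2**n_bits cannot be resolved from the quantized intensities."""
    return 1.0 / (2 ** n_bits)

# An 8-bit camera bounds subpixel tracking at 1/256 pixel;
# a 12-bit sensor pushes the limit to 1/4096 pixel.
```

This is why increasing sensor bit depth, not just interpolation order, matters when designing superresolution tracking experiments.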