15 results for SOFTWARE-RELIABILITY MODELS

at Universidad de Alicante


Relevance:

30.00%

Publisher:

Abstract:

Hardware/software partitioning (HSP) is a key task in embedded system co-design. Its main goal is to decide, taking into account a set of constraints expressed as metrics, which components of an application are to be executed on a general-purpose processor (software) and which on specific hardware. In recent years, several approaches driven by metaheuristic algorithms have been proposed for solving the HSP problem. However, due to the diversity of models and metrics used, the choice of the best-suited algorithm is still an open problem. This article presents the results of applying a fuzzy approach to the HSP problem. This approach is more flexible than many others because it can accept reasonably good solutions and reject those that do not seem good enough. In this work we compare six metaheuristic algorithms: Random Search, Tabu Search, Simulated Annealing, Hill Climbing, Genetic Algorithm and Evolutionary Strategy. The presented model aims to simultaneously minimize the hardware area and the execution time. The results show that Restart Hill Climbing is the best-performing algorithm in most cases.
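
The "accept quite good solutions / reject bad ones" idea can be illustrated with a minimal sketch, assuming simple linear membership functions and thresholds that are not taken from the paper: a candidate partition is scored by fuzzy membership degrees for the area and time goals, combined with a fuzzy AND (minimum), and accepted if the combined degree exceeds a threshold.

```c
/* Minimal sketch of a fuzzy acceptance criterion for HW/SW partitioning.
 * All names, membership shapes and thresholds are illustrative assumptions,
 * not taken from the paper. */
#include <stdio.h>

/* Linear membership: 1 below 'good', 0 above 'bad', linear in between. */
static double membership(double value, double good, double bad)
{
    if (value <= good) return 1.0;
    if (value >= bad)  return 0.0;
    return (bad - value) / (bad - good);
}

/* A candidate is "acceptable" if the weaker of the two degrees
 * (fuzzy AND as minimum) still exceeds the acceptance threshold. */
static int accept(double area, double time_ms, double threshold)
{
    double mu_area = membership(area,    100.0, 200.0);  /* area units   */
    double mu_time = membership(time_ms,  10.0,  25.0);  /* milliseconds */
    double mu = mu_area < mu_time ? mu_area : mu_time;
    return mu >= threshold;
}

int main(void)
{
    printf("accept? %d\n", accept(120.0, 12.0, 0.5)); /* quite good -> 1 */
    printf("accept? %d\n", accept(190.0, 24.0, 0.5)); /* poor       -> 0 */
    return 0;
}
```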

Relevance:

30.00%

Publisher:

Abstract:

Hardware/software partitioning is a fundamental task in embedded system co-design. It decides, taking the design metrics into account, which components will be executed on a general-purpose processor (software) and which on specific hardware. In recent years, several solutions to the partitioning problem driven by metaheuristic algorithms have been proposed. However, due to the diversity of models and metrics used, the choice of the most appropriate algorithm remains an open problem. This work presents a comparison of six metaheuristic algorithms: Random Search, Tabu Search, Simulated Annealing, Stochastic Hill Climbing, Genetic Algorithm and Evolution Strategy. The model used in the comparison aims to minimize the occupied area and the execution time; the model's constraints are treated as penalties so that other solutions are also included in the search space. The results show that Stochastic Hill Climbing and Evolution Strategy obtain the best results overall, followed by the Genetic Algorithm.
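
A minimal sketch of how a penalty-augmented cost and a simple hill-climbing loop could be wired together over a binary HW/SW partition vector is shown below; component sizes, weights and the penalty factor are illustrative assumptions, not the experimental setup of the paper.

```c
/* Sketch of a random-neighbour hill climber over a binary HW/SW partition
 * vector. Costs, weights and the penalty term are illustrative assumptions. */
#include <stdio.h>
#include <stdlib.h>

#define N 8   /* number of application components (hypothetical) */

static const double hw_area[N] = {4, 7, 2, 9, 5, 3, 6, 8};
static const double sw_time[N] = {5, 9, 3, 12, 6, 4, 8, 10};
static const double AREA_MAX = 25.0;        /* design constraint */

/* Weighted sum of area and time; constraint violation added as a penalty
 * so infeasible points stay in the search space but are discouraged. */
static double cost(const int part[N])
{
    double area = 0.0, time = 0.0;
    for (int i = 0; i < N; i++) {
        if (part[i]) area += hw_area[i];   /* 1 = hardware */
        else         time += sw_time[i];   /* 0 = software */
    }
    double penalty = area > AREA_MAX ? 10.0 * (area - AREA_MAX) : 0.0;
    return 0.5 * area + 0.5 * time + penalty;
}

int main(void)
{
    int cur[N] = {0};
    double best = cost(cur);
    for (int iter = 0; iter < 1000; iter++) {
        int j = rand() % N;          /* flip one random component */
        cur[j] ^= 1;
        double c = cost(cur);
        if (c <= best) best = c;     /* keep improving (or equal) moves */
        else           cur[j] ^= 1;  /* otherwise undo the move */
    }
    printf("best cost found: %.2f\n", best);
    return 0;
}
```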

Relevance:

30.00%

Publisher:

Abstract:

Three sets of laboratory column experiments concerning the hydrogeochemistry of seawater intrusion have been modelled using two codes: ACUAINTRUSION (Chemical Engineering Department, University of Alicante) and PHREEQC (U.S.G.S.). These reactive models use the hydrodynamic parameters determined with the ACUAINTRUSION TRANSPORT software and fit the chloride breakthrough curves perfectly. The ACUAINTRUSION code was improved, and its instabilities were studied with respect to the discretisation. Relative square errors were obtained for different combinations of spatial and temporal steps: a global error for the complete experimental dataset and a partial error for each element. Good simulations of the three experiments were obtained with ACUAINTRUSION using slight variations of the selectivity coefficients determined for both sediments in batch experiments with fresh water. The cation-exchange parameters included in ACUAINTRUSION follow the Gapon convention with modified exponents for the Ca/Mg exchange. PHREEQC simulations performed with the Gaines-Thomas convention were unsatisfactory when the exchange coefficients were taken from the PHREEQC database (or its range), while those determined with fresh water and natural sediment allowed only an approximation to be obtained. For the treated sediment, adjusted exchange coefficients were determined to improve the simulation; they differ greatly from the PHREEQC database and batch-experiment values, but are of an order similar to the others determined under dynamic conditions. The two software packages simulate different cation concentrations, a disparity that can be attributed to the defined selectivity coefficients, which affect the gypsum equilibrium. Consequently, each code calculates different sulphate concentrations, with ACUAINTRUSION predicting the smaller mismatch. In general, the ACUAINTRUSION and PHREEQC simulations produced similar results and predictions consistent with the experimental data. However, the simulated results are not identical to the experimental data: sulphate (total S) is overpredicted by both models, most likely due to factors such as gypsum kinetics, possible variations of the exchange coefficients with salinity and the neglect of other processes.
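
For reference, the Gaines-Thomas convention mentioned above writes exchange reactions per equivalent and expresses exchanger activities as equivalent fractions; a generic textbook form for a Na/Ca exchange is shown below (the coefficients actually fitted in the study are not reproduced here).

```latex
% Gaines-Thomas convention, generic form: exchange written per equivalent,
% exchanger activities expressed as equivalent fractions \beta, aqueous
% activities in square brackets.
\[
\mathrm{Na^{+} + \tfrac{1}{2}\,CaX_{2} \rightleftharpoons NaX + \tfrac{1}{2}\,Ca^{2+}},
\qquad
K_{\mathrm{Na/Ca}}
  = \frac{\beta_{\mathrm{NaX}}\,\big[\mathrm{Ca^{2+}}\big]^{1/2}}
         {\beta_{\mathrm{CaX_{2}}}^{1/2}\,\big[\mathrm{Na^{+}}\big]} .
\]
```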

Relevance:

30.00%

Publisher:

Abstract:

Context: Today's project managers have a myriad of methods to choose from for the development of software applications. However, they lack empirical data about the character of these methods in terms of usefulness, ease of use or compatibility, all of them relevant variables for assessing a developer's intention to use them. Objective: To compare three methods, each following a different paradigm (Model-Driven, Model-Based and Code-Centric), with respect to their adoption potential by junior software developers engaged in developing the business layer of a Web 2.0 application. Method: We conducted a quasi-experiment with 26 graduate students of the University of Alicante. The application developed was a social network organized around a fixed set of modules; three of them, similar in complexity, were used for the experiment. Subjects were asked to use a different method for each module and then to answer a questionnaire that gathered their perceptions during such use. Results: The results show that the Model-Driven method is regarded as the most useful, although it is also considered the least compatible with the developers' previous experience. They also show that junior software developers feel comfortable with the use of models, and that they are likely to use them if the models are accompanied by a Model-Driven development environment. Conclusions: Despite their relatively low level of compatibility, Model-Driven development methods seem to show great potential for adoption. However, further experimentation is needed before the results can be generalized to other populations, methods, languages and tools, domains or application sizes.

Relevance:

30.00%

Publisher:

Abstract:

In this article, a new methodology is presented to obtain representation models for an a priori relation z = u(x1, x2, ..., xn) (1), given an experimental dataset {(z_i; x_{1i}, x_{2i}, ..., x_{ni})}, i = 1, 2, ..., p. In this methodology, a potential energy is first defined over each possible model of relationship (1), which allows Lagrangian mechanics to be applied to the derived system. Solving the Euler–Lagrange equations of this system yields the optimal solution according to the principle of minimal action. The Lagrangian defined corresponds to a continuous medium over which an n-dimensional finite-element model is applied, so the problem can be solved through a compatible, determined, symmetric linear system of equations. The computational implementation of the methodology improves the process of obtaining the representation models previously published by the authors.
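
The minimal-action step can be stated in generic form: the model u that makes the action stationary satisfies the Euler–Lagrange equations of the chosen Lagrangian (the specific potential-energy term used by the authors is not reproduced here).

```latex
% Stationary-action condition in generic form over an n-dimensional domain:
% the model u that makes the action S stationary satisfies the
% Euler-Lagrange equations.
\[
S[u] = \int_{\Omega} L\!\left(u, \nabla u, x\right)\,dx,
\qquad
\delta S = 0
\;\Longrightarrow\;
\frac{\partial L}{\partial u}
 - \sum_{k=1}^{n}\frac{\partial}{\partial x_{k}}
   \left(\frac{\partial L}{\partial (\partial u/\partial x_{k})}\right) = 0 .
\]
```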

Relevance:

30.00%

Publisher:

Abstract:

Commercial off-the-shelf microprocessors are the core of low-cost embedded systems because of their programmability and cost-effectiveness. Recent advances in electronic technologies have allowed remarkable improvements in their performance, but they have also made microprocessors more susceptible to transient faults induced by radiation. These non-destructive events (soft errors) may cause a microprocessor to produce a wrong computation result or lose control of a system, with catastrophic consequences. Soft-error mitigation has therefore become a compulsory requirement for an increasing number of applications, operating from space down to ground level. In this context, this paper uses the concept of selective hardening, which aims at designing reduced-overhead and flexible mitigation techniques. Following this concept, a novel flexible version of the software-based fault-recovery technique known as SWIFT-R is proposed. Our approach makes it possible to select different subsets of the microprocessor register file to be protected in software. The design space is thus enriched with a wide spectrum of new partially protected versions, which offer more flexibility to designers and allow the best trade-offs between performance, code size, and fault coverage to be found. Three case studies show the applicability and flexibility of the proposal.
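
The recovery idea behind SWIFT-R, triplicating a value and majority-voting before it is used, can be sketched at the source level; the real technique is applied to selected registers at compile time, so the following C fragment is only a conceptual illustration with hypothetical names.

```c
/* Conceptual sketch of the triplication + majority-voting idea behind
 * SWIFT-R. The actual technique works on selected registers at compile
 * time; this source-level fragment is only illustrative. */
#include <stdint.h>
#include <stdio.h>

typedef struct {
    uint32_t a, b, c;          /* three copies of the protected value */
} prot_u32;

static void prot_write(prot_u32 *p, uint32_t v) { p->a = p->b = p->c = v; }

/* Bitwise majority vote recovers from a single corrupted copy. */
static uint32_t prot_read(prot_u32 *p)
{
    uint32_t v = (p->a & p->b) | (p->a & p->c) | (p->b & p->c);
    p->a = p->b = p->c = v;    /* scrub the copies after voting */
    return v;
}

int main(void)
{
    prot_u32 x;
    prot_write(&x, 0xCAFEu);
    x.b ^= 0x0010u;            /* simulate a single-bit upset in one copy */
    printf("recovered: 0x%X\n", (unsigned)prot_read(&x));  /* prints 0xCAFE */
    return 0;
}
```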

Relevance:

30.00%

Publisher:

Abstract:

The modeling of complex dynamic systems depends on solving a system of differential equations. Problems arise because the mathematical expressions of those equations are not known, although enough numerical data on the system variables are available. The authors consider it very important to establish a code between the different languages involved so that information can be encoded and decoded: coding makes it possible to reduce the study of some objects to that of others. The mathematical expressions used to model certain variables of the system are complex, so it is convenient to define an alphabet code that determines the correspondence between these equations and words over that alphabet. In this paper the authors introduce the coding and decoding of complex structural systems modeling.

Relevance:

30.00%

Publisher:

Abstract:

In an open system, each disequilibrium causes a force, and each force causes a flow process. Each flow is represented by a flow variable formally written as an equation, called the flow equation, and since each flow tends to equilibrate the system, these equations mathematically represent the tendency toward that equilibrium. In this paper, building on the concepts of conjugate forces and fluxes and of the dissipation function developed by Onsager and Prigogine, the authors propose the following hypothesis: in Prigogine's theorem, the flow is replaced by its flow equation, or by a flow orbital, with the conjugate force considered as a gradient. This makes it possible to obtain a dissipation function for each flow equation and an orbital dissipation function.
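
For context, the dissipation function in the Onsager-Prigogine framework is the sum of products of conjugate fluxes and forces, with the fluxes given by linear phenomenological relations; the authors' hypothesis substitutes each flux by its flow equation or orbital (only the standard generic form is shown here, not the authors' specific expressions).

```latex
% Classical dissipation (entropy-production) function: sum of products of
% conjugate fluxes J_k and forces X_k, with Onsager's linear relations.
\[
\sigma \;=\; \sum_{k} J_{k}\,X_{k} \;\ge\; 0,
\qquad
J_{i} \;=\; \sum_{k} L_{ik}\,X_{k} .
\]
```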

Relevance:

30.00%

Publisher:

Abstract:

This paper describes a study and analysis of surface-normal-based descriptors for 3D object recognition. Specifically, we evaluate the behaviour of these descriptors in the recognition process using virtual models of objects created with CAD software. We then test them in real scenes using synthetic objects created with a 3D printer from the virtual models. In both cases, the same virtual models are used in the matching process to find similarity; the difference between the two experiments lies in the type of views used in the tests. Our analysis evaluates three aspects: the effectiveness of the 3D descriptors depending on the camera viewpoint and the geometric complexity of the model, the runtime of the recognition process, and the success rate in recognizing a view of an object among the models stored in the database.
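
The matching step, finding the stored model whose descriptor is closest to the one computed from an observed view, reduces to a nearest-neighbour search; a minimal sketch with a hypothetical descriptor length and contents is shown below (real normal-based descriptors are computed from the point cloud of each view).

```c
/* Minimal nearest-neighbour matching of fixed-length descriptors by
 * squared Euclidean distance. Descriptor length and contents are
 * hypothetical placeholders. */
#include <stdio.h>

#define D 4        /* descriptor length (illustrative) */
#define MODELS 3   /* descriptors stored in the database */

static double dist2(const double *a, const double *b)
{
    double s = 0.0;
    for (int i = 0; i < D; i++) s += (a[i] - b[i]) * (a[i] - b[i]);
    return s;
}

int main(void)
{
    double db[MODELS][D] = {
        {0.1, 0.7, 0.1, 0.1},   /* model 0 */
        {0.4, 0.2, 0.2, 0.2},   /* model 1 */
        {0.0, 0.1, 0.8, 0.1},   /* model 2 */
    };
    double query[D] = {0.05, 0.15, 0.75, 0.05};  /* descriptor of the observed view */

    int best = 0;
    double best_d = dist2(query, db[0]);
    for (int m = 1; m < MODELS; m++) {
        double d = dist2(query, db[m]);
        if (d < best_d) { best_d = d; best = m; }
    }
    printf("matched model: %d (distance^2 = %.4f)\n", best, best_d);
    return 0;
}
```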

Relevance:

30.00%

Publisher:

Abstract:

The mathematical models of complex reality are texts belonging to a certain literature written in a semi-formal language, denominated L(MT) by the authors, whose linguistic-mathematical laws have been defined previously. Such a text possesses a linguistic entropy that reflects the physical entropy of the real-world processes the text describes. Through the temperature of information defined by Mandelbrot, the authors begin a text-reality thermodynamic theory that leads to the existence of information attractors, or highly structured points, establishing a heterogeneity of the text space analogous to that of the ontological space, in accordance with the well-known Saint Matthew law of the General Theory of Systems, formulated by Margalef as: "To the one who has, more will be given; and from the one who has not, even the little he possesses will be taken away."

Relevance:

30.00%

Publisher:

Abstract:

In this paper, the authors extend and generalize the methodology based on systems dynamics that uses differential equations as state equations, allowing first-order transformed functions to be applied not only to the primitive or original variables but also to more complex expressions derived from them, and extending the rules that govern the generation of transforms of order higher than zero (the variables or primitives). It is also shown that for every model of complex reality there exists a model that is complex from the syntactic and semantic points of view. The theory is illustrated with a concrete model: the MARIOLA model.

Relevance:

30.00%

Publisher:

Abstract:

Software-based techniques offer several advantages for increasing the reliability of processor-based systems at very low cost, but they cause performance degradation and an increase in code size. To meet performance and memory constraints, we propose SETA, a new software-only control-flow technique that uses assertions to detect errors affecting the program flow. SETA is an independent technique, but it was conceived to work together with previously proposed data-flow techniques that aim at reducing performance and memory overheads. Thus, SETA is combined with such data-flow techniques and submitted to a fault-injection campaign. Simulation and neutron-induced SEE tests show high fault coverage with performance and memory overheads lower than the state of the art.
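
Control-flow checking with assertions is commonly implemented by assigning a signature to each basic block and verifying, at block entry, that a runtime signature register holds the expected value; the sketch below shows this generic signature-checking idea with hypothetical signatures, not SETA's actual assertions.

```c
/* Generic sketch of signature-based control-flow checking: each basic
 * block carries a compile-time signature, a runtime signature is XOR-
 * updated on entry and compared against the expected value. Signatures
 * and the error handler are illustrative; SETA's actual assertions differ. */
#include <stdio.h>
#include <stdlib.h>

static unsigned G;                       /* runtime signature register */

static void cfc_error(void) { fprintf(stderr, "control-flow error\n"); exit(1); }

/* Entry check for a block: update G with the XOR difference from the
 * legal predecessor and compare it with this block's signature. */
#define BLOCK_ENTER(sig, diff) do { G ^= (diff); if (G != (sig)) cfc_error(); } while (0)

int main(void)
{
    G = 0x1u;                    /* signature of the entry block B1 */

    /* transition B1 -> B2: diff = sig(B1) ^ sig(B2) = 0x1 ^ 0x3 = 0x2 */
    BLOCK_ENTER(0x3u, 0x2u);
    puts("in block B2");

    /* transition B2 -> B3: diff = 0x3 ^ 0x7 = 0x4 */
    BLOCK_ENTER(0x7u, 0x4u);
    puts("in block B3");
    return 0;
}
```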

Relevance:

30.00%

Publisher:

Abstract:

Object tracking with subpixel accuracy is of fundamental importance in many fields, since it provides optimal performance at relatively low cost. Although many theoretical proposals lead to resolution increments of several orders of magnitude, in practice this resolution is limited by the imaging system. In this paper we propose and demonstrate through numerical models a realistic limit for subpixel accuracy. The final result is that the maximum achievable resolution enhancement is connected with the dynamic range of the image, i.e. the detection limit is 1/2^(number of bits). The results presented here may help in the proper design of super-resolution experiments in microscopy, surveillance, defense and other fields.
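
As a worked example, the quoted limit evaluates as follows for two common bit depths (the numbers follow directly from the 1/2^(number of bits) expression above).

```latex
% Worked example of the detection limit for two common bit depths.
\[
n_\mathrm{bits}=8:\quad \frac{1}{2^{8}}=\frac{1}{256}\approx 3.9\times10^{-3}\ \text{pixel},
\qquad
n_\mathrm{bits}=12:\quad \frac{1}{2^{12}}=\frac{1}{4096}\approx 2.4\times10^{-4}\ \text{pixel}.
\]
```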

Relevance:

30.00%

Publisher:

Abstract:

Integrity assurance of configuration data has a significant impact on the reliability of microcontroller-based systems. This is especially true for event-driven applications whose behavior is tightly coupled to this kind of data. This work proposes a new hybrid technique that combines hardware and software resources for detecting and recovering from soft errors in system configuration data. Our approach is based on using a common built-in microcontroller resource (a timer) that works jointly with a software-based technique responsible for periodically refreshing the configuration data. The experiments demonstrate that non-destructive single-event effects can be effectively mitigated with reduced overheads. Results show an important increase in fault coverage for SEUs and SETs, of about one order of magnitude.
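
The periodic-refresh mechanism can be sketched as a scrubbing routine driven by a timer interrupt that rewrites the configuration registers from a golden copy; in the sketch below the registers are simulated with a plain array so it runs on a host PC, and all names and values are hypothetical.

```c
/* Conceptual sketch of the hybrid approach: a periodic timer interrupt
 * rewrites configuration data from a golden copy so soft errors are
 * scrubbed. The "registers" are simulated with an array here; on a real
 * microcontroller they would be memory-mapped peripheral registers and
 * timer_isr() would be the handler of a periodic hardware timer. */
#include <stdint.h>
#include <stdio.h>

#define N_CFG 3

static volatile uint32_t cfg_reg[N_CFG];                            /* simulated registers */
static const uint32_t cfg_golden[N_CFG] = { 0x1Fu, 0xA00u, 0x3u };  /* golden copy */

/* Periodic scrubbing routine: detects and repairs corrupted entries. */
static int timer_isr(void)
{
    int repaired = 0;
    for (int i = 0; i < N_CFG; i++) {
        if (cfg_reg[i] != cfg_golden[i]) {
            cfg_reg[i] = cfg_golden[i];
            repaired++;
        }
    }
    return repaired;
}

int main(void)
{
    for (int i = 0; i < N_CFG; i++) cfg_reg[i] = cfg_golden[i];
    cfg_reg[1] ^= 0x40u;                              /* simulate an SEU in one register */
    printf("repaired %d register(s)\n", timer_isr()); /* prints 1 */
    return 0;
}
```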

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this study was to examine the reliability and validity of the School Anxiety Inventory (SAI) using a sample of 646 Slovenian adolescents (48% boys), ranging in age from 12 to 19 years. Single confirmatory factor analyses replicated the correlated four-factor structure of scores on the SAI for anxiety-provoking school situations (Anxiety about School Failure and Punishment, Anxiety about Aggression, Anxiety about Social Evaluation, and Anxiety about Academic Evaluation), and the three-factor structure of the anxiety response systems (Physiological Anxiety, Cognitive Anxiety, and Behavioral Anxiety). Equality of factor structures was compared using multigroup confirmatory factor analyses. Measurement invariance for the four- and three-factor models was obtained across gender and school-level samples. The scores of the instrument showed high internal reliability and adequate test–retest reliability. The concurrent validity of the SAI scores was also examined through its relationship with the Social Anxiety Scale for Adolescents (SASA) scores and the Questionnaire about Interpersonal Difficulties for Adolescents (QIDA) scores. Correlations of the SAI scores with scores on the SASA and the QIDA were of low to moderate effect sizes.