8 results for embedded level set
at Universidad de Alicante
Abstract:
Our main goal is to compute or estimate the calmness modulus of the argmin mapping of linear semi-infinite optimization problems under canonical perturbations, i.e., perturbations of the objective function together with continuous perturbations of the right-hand side of the constraint system (with respect to an index ranging over a compact Hausdorff space). Specifically, we provide a lower bound on the calmness modulus for semi-infinite programs with a unique optimal solution, which turns out to be the exact modulus when the problem is finitely constrained. The relationship between the calmness of the argmin mapping and the same property for the (sub)level set mapping (with respect to the objective function) is also explored for semi-infinite programs, without requiring the uniqueness of the nominal solution, and yields an upper bound on the calmness modulus of the argmin mapping. When confined to finitely constrained problems, we also provide an upper bound that is computable in the sense that it relies only on the nominal data and parameters, without involving elements in a neighborhood. Illustrative examples are provided.
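As background, the notion at the heart of this abstract admits a standard definition; the sketch below states it in generic notation (the symbols S, p-bar, x-bar are assumed placeholders, not taken from the paper itself).

```latex
% Standard definition of the calmness modulus of a set-valued mapping
% \mathcal{S} between metric spaces at a point (\bar p, \bar x) of its
% graph (generic notation; with the convention 0/0 := 0):
\[
\operatorname{clm} \mathcal{S}(\bar{p},\bar{x})
  = \limsup_{\substack{p \to \bar{p},\; x \to \bar{x} \\ x \in \mathcal{S}(p)}}
    \frac{d\bigl(x,\mathcal{S}(\bar{p})\bigr)}{d(p,\bar{p})}.
\]
% Calmness of \mathcal{S} at (\bar p, \bar x) means exactly that this
% modulus is finite.
```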
Abstract:
There is increasing concern to reduce the cost and overheads of developing reliable systems. Selective protection of the most critical parts of a system represents a viable solution to obtain a high level of reliability at a fraction of the cost. In particular, to design a selective fault-mitigation strategy for processor-based systems, it is mandatory to identify and prioritize the most vulnerable registers in the register file as the best candidates to be protected (hardened). This paper presents an application-based metric to estimate the criticality of each register in the register file of microprocessor-based systems. The proposed metric relies on the combination of three different criteria based on common features of the executed applications. The applicability and accuracy of our proposal have been evaluated on a set of applications running on different microprocessors. Results show a significant improvement in accuracy over previous approaches, regardless of the underlying architecture.
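The abstract does not disclose the three criteria themselves, so the following Python sketch only illustrates the general shape of such a metric: a weighted combination of normalised per-register features extracted from the executed application. Every feature name and weight here is a hypothetical placeholder, not the authors' actual proposal.

```python
# Hypothetical sketch of an application-based register-criticality metric.
# The three criteria and their weights are illustrative placeholders; the
# paper's actual criteria are not specified in the abstract.

def register_criticality(features, weights=(0.4, 0.3, 0.3)):
    """Combine three per-register criteria into one criticality score.

    features: dict mapping register name -> (c1, c2, c3), where each
    criterion is already normalised to [0, 1] (e.g. read frequency,
    average live-value lifetime, fraction of time holding live data).
    """
    w1, w2, w3 = weights
    return {reg: w1 * c1 + w2 * c2 + w3 * c3
            for reg, (c1, c2, c3) in features.items()}

# Rank registers so the most critical ones are hardened first.
profile = {"r0": (0.9, 0.8, 0.7), "r1": (0.2, 0.1, 0.3), "r2": (0.6, 0.9, 0.4)}
ranking = sorted(register_criticality(profile).items(),
                 key=lambda kv: kv[1], reverse=True)
print(ranking)  # best candidates for selective protection come first
```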
Abstract:
The total sea level variation (SLV) is the combination of steric and mass-induced SLV, whose exact shares are key to understanding the oceanic response to climate system changes. Total SLV can be observed by radar altimetry satellites such as TOPEX/POSEIDON and Jason-1/2. The steric SLV can be computed from temperature and salinity profiles, either from in situ measurements or from ocean general circulation models (OGCM), which can assimilate those observations. The mass-induced SLV can be estimated from its time-variable gravity (TVG) signals. We revisit this problem in the Mediterranean Sea, estimating the observed, steric, and mass-induced SLV. For the latter we analyze the latest TVG data set from the GRACE (Gravity Recovery and Climate Experiment) satellite mission launched in 2002, which is 3.5 times longer than in previous studies, applying a two-stage anisotropic filter to reduce the noise in high-degree and -order spherical harmonic coefficients. We confirm that the intra-annual total SLV is produced only by water mass changes, a fact explained in the literature as a result of the wind field around the Strait of Gibraltar. The steric SLV estimated from the residual of “altimetry minus GRACE” agrees in phase with that estimated from OGCMs and in situ measurements, although with a higher amplitude. The net water fluxes through the straits of Gibraltar and Sicily have also been estimated accordingly.
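For clarity, the budget underlying the “altimetry minus GRACE” residual can be written out: altimetry observes the total signal and GRACE the mass component, so the steric part follows by subtraction. The notation below is generic, not the paper's own.

```latex
% Sea level budget (generic notation): altimetry measures the total SLV,
% GRACE measures the mass-induced part, and the steric part is the residual.
\[
\eta_{\mathrm{total}}(t) = \eta_{\mathrm{steric}}(t) + \eta_{\mathrm{mass}}(t)
\quad\Longrightarrow\quad
\eta_{\mathrm{steric}}(t) \approx \eta_{\mathrm{altimetry}}(t) - \eta_{\mathrm{GRACE}}(t).
\]
```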
Abstract:
Hardware/software partitioning (HSP) is a key task in embedded system co-design. The main goal of this task is to decide which components of an application are to be executed on a general-purpose processor (software) and which on specific hardware, taking into account a set of restrictions expressed by metrics. In recent years, several approaches driven by metaheuristic algorithms have been proposed for solving the HSP problem. However, due to the diversity of models and metrics used, the choice of the best-suited algorithm is still an open problem. This article presents the results of applying a fuzzy approach to the HSP problem. This approach is more flexible than many others because it can accept reasonably good solutions and reject those that do not seem good. In this work we compare six metaheuristic algorithms: Random Search, Tabu Search, Simulated Annealing, Hill Climbing, Genetic Algorithm, and Evolutionary Strategy. The presented model aims to simultaneously minimize the hardware area and the execution time. The obtained results show that Restart Hill Climbing is the best-performing algorithm in most cases.
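As a rough illustration of how a fuzzy approach can soften the accept/reject decision, the Python sketch below scores a candidate partition with fuzzy membership functions over the two objectives. The membership shapes, limits, and toy data are assumptions for illustration, not the paper's actual model.

```python
# Hypothetical sketch of a fuzzy fitness for HW/SW partitioning.
# Membership functions and limits are illustrative, not the paper's model.

def satisfaction(value, good, bad):
    """Fuzzy membership: 1 below `good`, 0 above `bad`, linear in between."""
    if value <= good:
        return 1.0
    if value >= bad:
        return 0.0
    return (bad - value) / (bad - good)

def fuzzy_fitness(partition, area, time, area_limits, time_limits):
    """Score a partition (list of 0 = software / 1 = hardware per component).

    area(partition) and time(partition) are the two metrics to minimise;
    the fuzzy AND (minimum) accepts 'quite good' solutions gradually
    instead of applying a hard feasible/infeasible cut-off.
    """
    a = satisfaction(area(partition), *area_limits)
    t = satisfaction(time(partition), *time_limits)
    return min(a, t)

# Toy usage: 4 components with per-component HW area and SW execution time.
hw_area = [5, 3, 8, 2]
sw_time = [10, 6, 14, 4]
area = lambda p: sum(a for a, bit in zip(hw_area, p) if bit == 1)
time = lambda p: sum(t for t, bit in zip(sw_time, p) if bit == 0)
print(fuzzy_fitness([1, 0, 1, 0], area, time, (8, 16), (12, 24)))
```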
Abstract:
Commercial off-the-shelf microprocessors are the core of low-cost embedded systems due to their programmability and cost-effectiveness. Recent advances in electronic technologies have allowed remarkable improvements in their performance. However, they have also made microprocessors more susceptible to transient faults induced by radiation. These non-destructive events (soft errors), may cause a microprocessor to produce a wrong computation result or lose control of a system with catastrophic consequences. Therefore, soft error mitigation has become a compulsory requirement for an increasing number of applications, which operate from the space to the ground level. In this context, this paper uses the concept of selective hardening, which is aimed to design reduced-overhead and flexible mitigation techniques. Following this concept, a novel flexible version of the software-based fault recovery technique known as SWIFT-R is proposed. Our approach makes possible to select different registers subsets from the microprocessor register file to be protected on software. Thus, design space is enriched with a wide spectrum of new partially protected versions, which offer more flexibility to designers. This permits to find the best trade-offs between performance, code size, and fault coverage. Three case studies have been developed to show the applicability and flexibility of the proposal.
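SWIFT-R is known to recover from soft errors by triplicating data and majority-voting before critical uses. The Python sketch below is only a conceptual model of that idea applied selectively to a chosen register subset; the real technique operates at the assembly level, typically inserted by a compiler pass, and the register names here are hypothetical.

```python
# Conceptual model of selective SWIFT-R-style protection: values of
# registers chosen for hardening are triplicated, and a majority vote
# recovers the correct value before use.

PROTECTED = {"r1", "r3"}  # hypothetical selected subset of the register file

regs = {}  # register name -> list of replicas

def write(reg, value):
    # Protected registers keep three copies; unprotected registers keep one.
    regs[reg] = [value] * 3 if reg in PROTECTED else [value]

def read(reg):
    copies = regs[reg]
    if len(copies) == 3:
        a, b, c = copies
        voted = a if a == b or a == c else b  # majority of three
        regs[reg] = [voted] * 3               # recovery: restore all copies
        return voted
    return copies[0]

write("r1", 42)
regs["r1"][0] ^= 0x04   # inject a single bit-flip into one replica
print(read("r1"))        # -> 42: the fault is out-voted and corrected
```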
Abstract:
Objective. To determine the level of involvement of clinical nurses accredited by the Universitat Jaume I (Spain) as mentors of practice (Reference Nurses) in the evaluation of the competences of nursing students. Methodology. Cross-sectional study in which the “Clinical Practice Assessment Manuals” (CPAM, n=55) completed by 41 Reference Nurses were analyzed. Four quality criteria for completion were established, including completion of at least 80% of the required data and the presence of the signature and of the final grade in the right place. Verification of learning activities was also conducted. Data collection was performed concurrently by Reference Nurses and by the teachers of the subjects during the formative evaluations of the clinical clerkship period in the subject “Nursing Care in Healthcare Processes”, from March to June 2013. Results. 63% of the CPAM were completed correctly, without reaching the established quality threshold (80%). The absence of the signature was the main criterion of incorrect completion (21%). Nine learning activities did not meet the established quality threshold (80%) (p < 0.05), and there were significant differences according to clinical units (p < 0.05). Of the 30 learning activities evaluated in the CPAM, nine did not reach the established verification threshold (80%); therefore it cannot be assumed that these activities were completed by students and evaluated by the Reference Nurses throughout the clinical clerkship period. Conclusion. The level of involvement of the Reference Nurses cannot be considered adequate, and strategies to encourage involvement through collaboration and training must be developed.
Abstract:
Paper presented at the V Jornadas de Computación Empotrada (5th Workshop on Embedded Computing), Valladolid, Spain, 17-19 September 2014
Abstract:
The UK construction industry comprises a very high proportion of SMEs, that is, companies employing up to 250 people. A Department for Business, Innovation and Skills research paper found that SMEs had a 71.2% share of work in the construction industry; micro and small firms (those employing up to 50 people) had a 46.7% share (Ive and Murray 2013). The Government has high ambitions for UK construction. The industry has been found by successive government-commissioned studies to be inefficient and highly fragmented, and ambitious targets have been set for it: a 33% reduction in costs and 50% faster delivery by 2025. As a significant construction client, the Government has mandated the use of Level 2 BIM on publicly funded projects over £5 million from 2016. The adoption of BIM plays a key role in the 2025 vision, but a lack of clarity persists in the industry over BIM, and significant barriers to its implementation are perceived, particularly amongst SMEs. Industry-wide transformation will be challenging without serious consideration of the capabilities of this large majority. Many larger firms, having implemented Level 2 BIM, are now working towards Level 3 BIM, while many of the smaller firms in the industry have not even heard of BIM; it would seem that fears of a ‘two-tier’ industry are perhaps being realised. This paper builds on an earlier one (Mellon & Kouider 2014) and investigates, through fieldwork, the level of Level 2 BIM implementation amongst SMEs compared to a large organisation. Challenges and innovative solutions identified through the collected data are discussed and compared in full. It is suggested that where an SME perceives barriers to adopting the technologies which underpin BIM, it may consider collaborative methods of working as an interim step towards realising the efficiencies and benefits that these methods can yield. While the barriers to BIM adoption are significant, it is suggested that they are not insurmountable for the SME, and some recommendations for possible solutions are made.