956 results for embedded, system, entropy, pool, TRNG, random, ADC
Abstract:
The adaptation of the Spanish university system to the European Higher Education Area (EEES in Spanish) demands the integration of new tools and skills that ease the teaching-learning process. This adaptation involves a change in evaluation methods, from a system where the student was assessed with a single final exam to one that includes continuous assessment, in which the final exam may represent at most 50% of the grade in the vast majority of universities. Devising a new and fair continuous evaluation system is not an easy task: it requires teachers to follow up on each student's learning process and consequently places an additional workload on existing staff. Traditionally, continuous evaluation is associated with the daily work of the student and a collection of marks based partly or entirely on the work done during the academic year. Small groups of students and attendance control are important aspects for an adequate assessment of students. However, most university degrees have groups of more than 70 students, and attendance control is a complicated task to perform, mostly because it consumes significant amounts of staff time. A further problem is that attendance control encourages uninterested students to be present in class, which may disturb their classmates. After a two-year experience developing continuous assessment in Statistics subjects in Social Science degrees, we think that individual, periodic tasks are the best way to assess results. These tasks or examinations must be completed in the classroom during regular lessons, so we need an efficient system for assembling different, personalized question sets in order to prevent students from cheating.
In this paper we provide an efficient and effective way to produce randomized examination papers using Sweave, a tool that generates data, graphics and statistical calculations from the software R and presents the results in PDF documents created with LaTeX. In this way, we can design an exam template that is compiled to generate as many distinct PDF documents as required while, at the same time, producing the solutions needed to correct them easily.
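A minimal sketch of the template-plus-seed idea in Python (hypothetical code, not the paper's Sweave/R implementation: there, an .Rnw template containing R chunks is compiled to LaTeX and then to PDF):

```python
import random

def make_exam(seed):
    """Generate one randomized statistics exam plus its solution.

    Illustrative stand-in for a Sweave template: the same template,
    compiled with a different seed, yields a different paper.
    """
    rng = random.Random(seed)
    data = [rng.randint(10, 99) for _ in range(8)]
    mean = sum(data) / len(data)
    question = ("Compute the sample mean of the following values: "
                + ", ".join(str(v) for v in data))
    solution = f"Mean = {mean:.2f}"
    # In the Sweave workflow this would be an .Rnw file compiled to PDF;
    # here we simply emit LaTeX source as a string.
    tex = ("\\documentclass{article}\\begin{document}"
           + question + "\\end{document}")
    return tex, solution

# One compilation per student: 30 distinct papers with matching solutions.
papers = [make_exam(seed) for seed in range(30)]
```

In the actual workflow the randomization lives in R code chunks inside the Sweave template; recompiling the template with different seeds plays the role of the loop above.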
Abstract:
Commercial off-the-shelf microprocessors are the core of low-cost embedded systems due to their programmability and cost-effectiveness. Recent advances in electronic technologies have allowed remarkable improvements in their performance. However, they have also made microprocessors more susceptible to transient faults induced by radiation. These non-destructive events (soft errors) may cause a microprocessor to produce a wrong computation result or lose control of a system, with catastrophic consequences. Therefore, soft error mitigation has become a compulsory requirement for an increasing number of applications, operating from space down to ground level. In this context, this paper uses the concept of selective hardening, which aims to design reduced-overhead, flexible mitigation techniques. Following this concept, a novel flexible version of the software-based fault recovery technique known as SWIFT-R is proposed. Our approach makes it possible to select different subsets of the microprocessor register file to be protected in software. Thus, the design space is enriched with a wide spectrum of new partially protected versions, which offer more flexibility to designers and permit finding the best trade-offs between performance, code size, and fault coverage. Three case studies have been developed to show the applicability and flexibility of the proposal.
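The core of SWIFT-R-style recovery is triplication of selected registers plus majority voting on every read. A toy sketch of that idea (illustrative Python only; real SWIFT-R is a compiler transformation over machine registers, and the names below are invented for illustration):

```python
def vote(a, b, c):
    """Majority vote over three copies; masks one corrupted copy."""
    return a if a == b or a == c else b

class ProtectedReg:
    """Triplicated 'register' in the spirit of SWIFT-R."""
    def __init__(self, value):
        self.r1 = self.r2 = self.r3 = value

    def read(self):
        v = vote(self.r1, self.r2, self.r3)
        self.r1 = self.r2 = self.r3 = v   # recovery: rewrite all copies
        return v

    def write(self, value):
        self.r1 = self.r2 = self.r3 = value

# Selective hardening: only this register is protected; others stay plain.
acc = ProtectedReg(0)
for i in range(5):
    acc.write(acc.read() + i)

acc.r2 ^= 0x40            # inject a single-event upset in one copy
assert acc.read() == 10   # majority vote masks and repairs the fault
```

Protecting only the registers that matter most (here, the accumulator) is what yields the partially protected versions and the performance/coverage trade-off described above.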
Abstract:
Paper presented at CIDUI 2010, Congreso Internacional Docencia Universitaria e Innovación (International Conference on University Teaching and Innovation), Barcelona, 30 June to 2 July 2010.
Abstract:
Information technologies (IT) currently represent 2% of CO2 emissions. In recent years, a wide variety of IT solutions have been proposed, focused on increasing the energy efficiency of network data centers. Monitoring is one of the fundamental pillars of these systems, providing the information necessary for adequate decision making. However, today’s monitoring systems (MSs) are partial, specific and highly coupled solutions. This study proposes a model for monitoring data centers that serves as a basis for energy saving systems, offered as a value-added service embedded in a device with low cost and power consumption. The proposal is general in nature, comprehensive, scalable and focused on heterogeneous environments, and it allows quick adaptation to the needs of changing and dynamic environments. Further, a prototype of the system has been implemented in several devices, which has allowed validation of the proposal in addition to identification of the minimum hardware profile required to support the model.
Abstract:
This thesis presents the process of conducting the inventory of the old tile collection of the Faculty of Fine Arts, University of Lisbon (FBAUL). The collection can be divided into two major groups: the first is integrated into the building; the second consists of a set of 2036 loose tiles with decorative, ornamental and figurative patterns, some of which form panels of great value. Given the existence of a wide variety of tiles of unknown provenance, stored at random, we felt the need to develop an inventory process intended to safeguard and preserve these heritage objects, whose existence was virtually unknown until the beginning of this work. This ongoing process followed a working methodology that started with identification, photographic survey and labelling, followed by the completion of an inventory sheet. To obtain information about the loose tiles, it was essential to first clean off the mortar that prevented the reading of the existing information, a process carried out with the support of undergraduate students. After completing the above steps, we assembled the existing panels, most of them very fragmented, in order to recover some iconographic references. In this process we identified 21 types of patterns belonging to the seventeenth and eighteenth centuries, and 30 figurative and ornamental panels. We then established interconnections between them and the sets placed in situ around the building, some of whose tiles had been used to fill spaces or gaps. At the same time, we created the inventory, diagnosis and intervention records, as well as a database for internal consultation, in Excel, organized with a filtering system to allow a quick search of all the tiles held at FBAUL. Finally, we propose a room to house the collection of loose tiles, with reference to the ideal environmental conditions and its packaging.
We also propose the very occasional removal of some tiles embedded in the walls of the building that form part of panels otherwise composed of loose tiles, together with a proposal to replace each one with another coherent and complete element belonging to the collection, preferably with a pattern motif.
Abstract:
With the development of embedded applications and driving assistance systems, it becomes relevant to develop parallel mechanisms to check and diagnose these new systems. In this thesis we focus our research on one such parallel mechanism, analytical redundancy, for fault diagnosis of an automotive suspension system. We considered a quarter-car passive suspension model and used an ARX-model parameter estimation method to detect faults occurring in the damper and spring of the system. We then deployed a neural network classifier to isolate the faults and identify where each fault is occurring, so that safety measures and redundancies can take effect to prevent system failure. It is shown that the ARX estimator can quickly detect faults online using vertical acceleration and displacement sensor data, which are common sensors in today's vehicles. The clear divergence in the ARX response makes it easy to set a threshold that alerts the vehicle's intelligent system, and the neural classifier can quickly locate the fault.
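The detection principle can be sketched with a first-order ARX model fitted by least squares (a simplified, hypothetical illustration; the thesis identifies higher-order models from real acceleration and displacement signals):

```python
def estimate_arx(y, u):
    """Least-squares fit of a first-order ARX model
        y[k] = a*y[k-1] + b*u[k-1] + e[k]
    solved via the 2x2 normal equations for theta = (a, b)."""
    s11 = s12 = s22 = r1 = r2 = 0.0
    for k in range(1, len(y)):
        p1, p2 = y[k - 1], u[k - 1]
        s11 += p1 * p1; s12 += p1 * p2; s22 += p2 * p2
        r1 += p1 * y[k]; r2 += p2 * y[k]
    det = s11 * s22 - s12 * s12
    a = (r1 * s22 - r2 * s12) / det
    b = (r2 * s11 - r1 * s12) / det
    return a, b

# Excitation signal and a simulated healthy plant with known a = 0.8, b = 0.5.
u = [1.0 if k % 7 == 0 else 0.0 for k in range(200)]

def simulate(a, b):
    y = [0.0]
    for k in range(1, 200):
        y.append(a * y[k - 1] + b * u[k - 1])
    return y

a_hat, b_hat = estimate_arx(simulate(0.8, 0.5), u)
# Estimates drifting away from nominal values flag a damper or spring fault.
```

Running the estimator over a sliding window of sensor data and thresholding the parameter (or residual) divergence gives the online detection described in the abstract.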
Abstract:
We introduce a novel way of measuring the entropy of a set of values undergoing changes. Such a measure becomes useful when analyzing the temporal development of an algorithm designed to numerically update a collection of values such as artificial neural network weights undergoing adjustments during learning. We measure the entropy as a function of the phase-space of the values, i.e. their magnitude and velocity of change, using a method based on the abstract measure of entropy introduced by the philosopher Rudolf Carnap. By constructing a time-dynamic two-dimensional Voronoi diagram using Voronoi cell generators with coordinates of value- and value-velocity (change of magnitude), the entropy becomes a function of the cell areas. We term this measure teleonomic entropy since it can be used to describe changes in any end-directed (teleonomic) system. The usefulness of the method is illustrated when comparing the different approaches of two search algorithms, a learning artificial neural network and a population of discovering agents. (C) 2004 Elsevier Inc. All rights reserved.
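A rough sketch of the idea in Python, with Voronoi cell areas estimated by Monte Carlo nearest-neighbour counting over a bounded phase-space window (an approximation for illustration only; the paper constructs an exact time-dynamic Voronoi diagram):

```python
import math
import random

def teleonomic_entropy(points, samples=20000, seed=0):
    """Entropy from Voronoi cell areas in a bounded 2-D phase space.

    Each point is a generator (value, value-velocity). Cell areas over
    the unit square are estimated by counting which generator is nearest
    to each random query point; normalized areas act as probabilities.
    """
    rng = random.Random(seed)
    counts = [0] * len(points)
    for _ in range(samples):
        qx, qy = rng.random(), rng.random()
        nearest = min(range(len(points)),
                      key=lambda i: (points[i][0] - qx) ** 2 +
                                    (points[i][1] - qy) ** 2)
        counts[nearest] += 1
    probs = [c / samples for c in counts if c > 0]
    return -sum(p * math.log(p) for p in probs)

# Evenly spread generators give near-equal cells (high entropy);
# clustered generators give very unequal cells (low entropy).
spread = [(i / 4 + 0.1, j / 4 + 0.1) for i in range(4) for j in range(4)]
clustered = [(0.05 + 0.01 * i, 0.05 + 0.01 * j)
             for i in range(4) for j in range(4)]
e_spread = teleonomic_entropy(spread)
e_clustered = teleonomic_entropy(clustered)
```

Tracking this quantity over training iterations is what lets the measure describe how a learning system's weights spread out or converge in phase space.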
Abstract:
Foreign exchange trading has recently emerged as a significant activity in many countries. As with most forms of trading, the activity is influenced by many random parameters, so the creation of a system that effectively emulates the trading process would be very helpful. A major issue for traders in the deregulated Foreign Exchange Market is when to sell and when to buy a particular currency in order to maximize profit. This paper presents novel trading strategies based on the machine learning methods of genetic algorithms and reinforcement learning.
Abstract:
Consider a network of unreliable links, modelling for example a communication network. Estimating the reliability of the network, expressed as the probability that certain nodes in the network are connected, is a computationally difficult task. In this paper we study how the Cross-Entropy method can be used to obtain more efficient network reliability estimation procedures. Three techniques of estimation are considered: Crude Monte Carlo and the more sophisticated Permutation Monte Carlo and Merge Process. We show that the Cross-Entropy method yields a speed-up over all three techniques.
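The flavour of the Cross-Entropy approach can be sketched on a toy two-terminal network (the five-edge bridge below, with link failure probability Q = 0.05, is an invented example, not one of the paper's test cases; the paper's Permutation Monte Carlo and Merge Process machinery is not reproduced here):

```python
import random

EDGES = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 3)]  # 4-node bridge network
Q = 0.05          # true independent failure probability of each link
S, T = 0, 3       # terminal nodes whose connectivity defines reliability

def disconnected(failed):
    """True if S and T are no longer connected once failed links are removed."""
    adj = {n: [] for n in range(4)}
    for e, (u, v) in enumerate(EDGES):
        if not failed[e]:
            adj[u].append(v)
            adj[v].append(u)
    seen, stack = {S}, [S]
    while stack:
        for w in adj[stack.pop()]:
            if w not in seen:
                seen.add(w)
                stack.append(w)
    return T not in seen

def ce_unreliability(iters=5, n=5000, seed=1):
    """Cross-Entropy importance sampling for P(S and T disconnected).

    Link failure probabilities are tilted so the rare event occurs often,
    then refitted each iteration from the likelihood-ratio-weighted
    samples that hit it; the weighted mean stays an unbiased estimate.
    """
    rng = random.Random(seed)
    m = len(EDGES)
    v = [0.5] * m                     # initial tilted failure probabilities
    estimate = 0.0
    for _ in range(iters):
        num, wsum = [0.0] * m, 0.0
        for _ in range(n):
            x = [rng.random() < v[e] for e in range(m)]
            w = 1.0                   # likelihood ratio: true / tilted
            for e in range(m):
                w *= (Q / v[e]) if x[e] else ((1 - Q) / (1 - v[e]))
            if disconnected(x):
                wsum += w
                for e in range(m):
                    if x[e]:
                        num[e] += w
        if wsum > 0:                  # CE update of the tilting parameters
            v = [min(max(num[e] / wsum, 0.01), 0.99) for e in range(m)]
        estimate = wsum / n
    return estimate

# Exact unreliability of this bridge at Q = 0.05 is about 0.00522.
estimate = ce_unreliability()
```

Crude Monte Carlo would need on the order of 1/0.00522 samples just to see the event once; the tilted sampler hits it in a large fraction of draws, which is the speed-up the abstract refers to.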
Abstract:
Based on morphological features alone, there is considerable difficulty in identifying the 5 most economically damaging weed species of Sporobolus [viz. S. pyramidalis P. Beauv., S. natalensis (Steud.) Dur and Schinz, S. fertilis (Steud.) Clayton, S. africanus (Poir.) Robyns and Tourney, and S. jacquemontii Kunth.] found in Australia. A polymerase chain reaction (PCR)-based random amplified polymorphic DNA (RAPD) technique was used to create a series of genetic markers that could positively identify the 5 major weeds from the other less damaging weedy and native Sporobolus species. In the initial RAPD profiling experiment, using arbitrarily selected primers and involving 12 species of Sporobolus, 12 genetic markers were found that, when used in combination, could consistently distinguish the 5 weedy species from all others. Of these 12 markers, the most diagnostic were UBC51(490) for S. pyramidalis and S. natalensis; UBC43(310, 2000, 2100) for S. fertilis and S. africanus; and OPA20(850) and UBC43(470) for S. jacquemontii. Species-specific markers could be found only for S. jacquemontii. In an effort to understand why there was difficulty in obtaining species-specific markers for some of the weedy species, a RAPD data matrix was created using 40 RAPD products. These 40 products, amplified by 6 random primers from 45 individuals belonging to 12 species, were then subjected to numerical taxonomy and multivariate system (NTSYS-pc version 1.70) analysis. The RAPD similarity matrix generated from the analysis indicated that S. pyramidalis was genetically more similar to S. natalensis than to other species of the 'S. indicus complex'. Similarly, S. jacquemontii was more similar to S. pyramidalis, and S. fertilis was more similar to S. africanus, than to other species of the complex. Sporobolus pyramidalis, S. jacquemontii, S. africanus, and S. creber exhibited low within-species genetic diversity, whereas high genetic diversity was observed within S. natalensis, S. fertilis, S. sessilis, S. elongatus, and S. laxus. Cluster analysis placed all of the introduced species (major and minor weedy species) into one major cluster, with S. pyramidalis and S. natalensis in one distinct subcluster and S. fertilis and S. africanus in another. The native species formed separate clusters in the phenograms. The close genetic similarity of S. pyramidalis to S. natalensis, and of S. fertilis to S. africanus, may explain the difficulty in obtaining RAPD species-specific markers. These results will be of importance to the Australian dairy and beef industries and will aid in the development of an integrated management strategy for these weeds.
Abstract:
We consider the problem of estimating P(Y_1 + ... + Y_n > x) by importance sampling when the Y_i are i.i.d. and heavy-tailed. The idea is to exploit the cross-entropy method as a tool for choosing good parameters in the importance sampling distribution; in doing so, we use the asymptotic description that, given Y_1 + ... + Y_n > x, n - 1 of the Y_i have distribution F and one has the conditional distribution of Y given Y > x. We show in some specific parametric examples (Pareto and Weibull) how this leads to precise answers which, as demonstrated numerically, are close to being variance-minimal within the parametric class under consideration. Related problems for M/G/1 and GI/G/1 queues are also discussed.
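A hedged sketch of the parametric idea for the Pareto case (illustrative only; the tilted shape here comes from a plain weighted refit on a pilot run, not from the paper's asymptotic conditional analysis):

```python
import math
import random

def pareto(rng, alpha):
    """Pareto(alpha) sample on (1, inf); density f(y) = alpha * y**-(alpha + 1)."""
    return rng.random() ** (-1.0 / alpha)

def log_lr(ys, alpha, tilt):
    """Log likelihood ratio prod_i f_alpha(y_i) / f_tilt(y_i)."""
    return (len(ys) * math.log(alpha / tilt)
            + (tilt - alpha) * sum(math.log(y) for y in ys))

def ce_tilt(alpha, n, x, reps=20000, seed=1):
    """One cross-entropy step: refit the Pareto shape from the samples
    that exceed x, weighted by their likelihood ratios."""
    rng = random.Random(seed)
    pilot = alpha / 2        # heavier-tailed pilot so exceedances occur
    wsum = wlog = 0.0
    for _ in range(reps):
        ys = [pareto(rng, pilot) for _ in range(n)]
        if sum(ys) > x:
            w = math.exp(log_lr(ys, alpha, pilot))
            wsum += w * n
            wlog += w * sum(math.log(y) for y in ys)
    return wsum / wlog       # weighted maximum-likelihood shape estimate

def is_tail_prob(alpha, n, x, tilt, reps=50000, seed=2):
    """Importance-sampling estimate of P(Y_1 + ... + Y_n > x): each Y_i is
    drawn from Pareto(tilt) and reweighted back to Pareto(alpha)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(reps):
        ys = [pareto(rng, tilt) for _ in range(n)]
        if sum(ys) > x:
            total += math.exp(log_lr(ys, alpha, tilt))
    return total / reps

alpha, n, x = 2.0, 2, 10.0
tilt = ce_tilt(alpha, n, x)              # heavier-than-alpha fitted shape
estimate = is_tail_prob(alpha, n, x, tilt)
```

Because the fitted tilt stays below alpha, every likelihood-ratio weight is bounded, which keeps the estimator's variance small within this parametric family.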
Abstract:
Aim. The paper presents a study assessing the rate of adoption of a sedation scoring system and sedation guideline. Background. Clinical practice guidelines including sedation guidelines have been shown to improve patient outcomes by standardizing care. In particular sedation guidelines have been shown to be beneficial for intensive care patients by reducing the duration of ventilation. Despite the acceptance that clinical practice guidelines are beneficial, adoption rates are rarely measured. Adoption data may reveal other factors which contribute to improved outcomes. Therefore, the usefulness of the guideline may be more appropriately assessed by collecting adoption data. Method. A quasi-experimental pre-intervention and postintervention quality improvement design was used. Adoption was operationalized as documentation of sedation score every 4 hours and use of the sedation and analgesic medications suggested in the guideline. Adoption data were collected from patients' charts on a random day of the month; all patients in the intensive care unit on that day were assigned an adoption category. Sedation scoring system adoption data were collected before implementation of a sedation guideline, which was implemented using an intensive information-giving strategy, and guideline adoption data were fed back to bedside nurses. After implementation of the guideline, adoption data were collected for both the sedation scoring system and the guideline. The data were collected in the years 2002-2004. Findings. The sedation scoring system was not used extensively in the pre-intervention phase of the study; however, this improved in the postintervention phase. The findings suggest that the sedation guideline was gradually adopted following implementation in the postintervention phase of the study. Field notes taken during the implementation of the sedation scoring system and the guideline reveal widespread acceptance of both. Conclusion. 
Measurement of adoption is a complex process. Appropriate operationalization contributes to greater accuracy. Further investigation is warranted to establish the intensity and extent of implementation required to positively affect patient outcomes.
Abstract:
We consider a problem of robust performance analysis of linear discrete time-varying systems on a bounded time interval. The system is represented in state-space form and is driven by a random input disturbance with imprecisely known probability distribution; this distributional uncertainty is described in terms of entropy. The worst-case performance of the system is quantified by its a-anisotropic norm. Computing the anisotropic norm is reduced to solving a set of difference Riccati and Lyapunov equations together with an equation of special form.
Abstract:
We consider the problems of computing the power and exponential moments EX^s and Ee^{tX} of square Gaussian random matrices X = A + BWC for positive integer s and real t, where W is a standard normal random vector and A, B, C are appropriately dimensioned constant matrices. We solve the problems by a matrix product scalarization technique and interpret the solutions in system-theoretic terms. The results of the paper are applicable to Bayesian prediction in multivariate autoregressive time series and mean-reverting diffusion processes.
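In the scalar special case X = a + bWc with W ~ N(0, 1), X is N(a, (bc)^2), so both moments have closed forms that a quick Monte Carlo run can cross-check (a sanity sketch only, with invented parameter values; the paper's scalarization technique addresses the genuinely matrix-valued case, which this does not):

```python
import math
import random

# Scalar case of X = A + B*W*C: X ~ N(a, (b*c)**2), hence
#   E[X^2]    = a**2 + (b*c)**2
#   E[e^{tX}] = exp(t*a + t**2 * (b*c)**2 / 2)   (normal MGF)
a, b, c, t = 1.0, 0.5, 2.0, 0.3

rng = random.Random(0)
xs = [a + b * rng.gauss(0.0, 1.0) * c for _ in range(200000)]

power_mc = sum(x * x for x in xs) / len(xs)            # Monte Carlo E[X^2]
exp_mc = sum(math.exp(t * x) for x in xs) / len(xs)    # Monte Carlo E[e^{tX}]

power_exact = a ** 2 + (b * c) ** 2                    # = 2.0 here
exp_exact = math.exp(t * a + t ** 2 * (b * c) ** 2 / 2)
```

The same sampling check extends to the matrix case (replace W by a standard normal vector and the products by matrix products), at the cost of Monte Carlo error that the paper's closed-form scalarization avoids.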