965 results for Web-Assisted Error Detection
Abstract:
This work deals with the application of error-detecting and error-correcting codes to the design of Fault Tolerant Computers, presenting several optimal detection and correction strategies for some of their subsystems. First of all, the necessity of applying fault-tolerance techniques is justified. Next, forecasts of the evolution of integration technology are made, together with a classification of the faults occurring in integrated circuits. Starting from a compilation and review of coding theory, a theoretical development is carried out whose application makes it possible to force some of these codes to be closed under the elementary operations executed in a computer. Optimal error detection and correction strategies are presented for the most important subsystems, culminating in the design, construction and testing of a memory unit and a data-processing unit with extensive error detection and correction capabilities.
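The closure of a code under a machine's elementary operations is what makes single-error detection and correction practical in memory and datapath hardware. As a rough illustration of the underlying mechanism (not the codes developed in the work), the sketch below encodes four data bits with a textbook Hamming(7,4) code, injects a single bit flip, and corrects it from the syndrome.

```python
# Minimal sketch of single-error correction with a systematic Hamming(7,4) code.
# G and H are the textbook generator and parity-check matrices, not those of the thesis.
import numpy as np

G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def encode(data):
    return (np.array(data) @ G) % 2

def correct(received):
    syndrome = (H @ received) % 2
    if syndrome.any():                       # non-zero syndrome: error detected
        for i in range(H.shape[1]):
            if np.array_equal(H[:, i], syndrome):
                received = received.copy()
                received[i] ^= 1             # flip the erroneous bit back
                break
    return received

codeword = encode([1, 0, 1, 1])
corrupted = codeword.copy()
corrupted[2] ^= 1                            # inject a single-bit fault
assert np.array_equal(correct(corrupted), codeword)
print("single-bit error detected and corrected")
```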
Abstract:
The concept of service-oriented architecture has been extensively explored in software engineering, because it produces architectures made up of several interconnected modules that are easy to reuse when building new systems. This approach to design would be impossible without interconnection mechanisms such as REST (Representational State Transfer) services, which allow module communication while minimizing coupling. However, this low coupling brings disadvantages, such as a lack of transparency, which makes it difficult to systematically create tests without knowledge of the inner workings of a system. In this article, we present an automatic error detection system for REST services, based on a statistical analysis of the responses produced by multiple service invocations. Thus, a service can be systematically tested without knowing its full specification. The method can find errors in REST services that could not be identified by means of traditional testing methods, and provides limited testing coverage for services whose response format is unknown. It can also be useful as a complement to other testing mechanisms.
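As a rough sketch of the idea (not the paper's actual analysis), the snippet below invokes a hypothetical endpoint repeatedly, records the status code and the set of JSON keys of each response, and flags observations that occur too rarely to match the statistical norm.

```python
# Statistical error detection for a REST service, illustrated on a hypothetical
# endpoint: repeated invocations, then flag rare status codes or response "shapes".
import requests
from collections import Counter

URL = "https://api.example.com/items/42"   # hypothetical endpoint
N = 50

statuses, shapes = [], []
for _ in range(N):
    resp = requests.get(URL, timeout=5)
    statuses.append(resp.status_code)
    try:
        shapes.append(frozenset(resp.json().keys()))  # observed JSON key set
    except ValueError:
        shapes.append(None)                           # non-JSON body

# Report status codes or response shapes seen in fewer than 5% of the calls.
for counter in (Counter(statuses), Counter(shapes)):
    common, _ = counter.most_common(1)[0]
    for value, count in counter.items():
        if value != common and count / N < 0.05:
            print(f"possible error: rare observation {value!r} ({count}/{N})")
```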
Abstract:
The requirement for systems to continue to operate satisfactorily in the presence of faults has led to the development of techniques for the construction of fault tolerant software. This thesis addresses the problem of error detection and recovery in distributed systems which consist of a set of communicating sequential processes. A method is presented for the `a priori' design of conversations for this class of distributed system. Petri nets are used to represent the state and to solve state reachability problems for concurrent systems. The dynamic behaviour of the system can be characterised by a state-change table derived from the state reachability tree. Systematic conversation generation is possible by defining a closed boundary on any branch of the state-change table. By relating the state-change table to process attributes, the method ensures that all necessary processes are included in the conversation. The method also ensures properly nested conversations. An implementation of the conversation scheme using the concurrent language occam is proposed. The structure of the conversation is defined using the special features of occam, and the proposed implementation gives a structure which is independent of the application and of the number of processes involved. Finally, the integrity of inter-process communications is investigated. The basic communication primitives used in message-passing systems are seen to have deficiencies when applied to systems with safety implications. Using a Petri net model, a boundary for a time-out mechanism is proposed which will increase the integrity of a system that involves inter-process communications.
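Because the state-change table is derived from the reachability tree of the Petri net model, the core computation is a breadth-first exploration of markings. The sketch below (an illustrative toy net, not the thesis's model or its occam implementation) builds that reachability relation; each recorded edge corresponds to one row of a state-change table.

```python
# Reachability exploration of a small Petri net. Transitions are given as
# (consumed, produced) token counts over places P1..P3; the net is a made-up example.
from collections import deque

transitions = {
    "t1": ({"P1": 1}, {"P2": 1}),   # fire t1: consume a token in P1, produce one in P2
    "t2": ({"P2": 1}, {"P3": 1}),
    "t3": ({"P3": 1}, {"P1": 1}),
}

def enabled(marking, consumed):
    return all(marking.get(p, 0) >= n for p, n in consumed.items())

def fire(marking, consumed, produced):
    m = dict(marking)
    for p, n in consumed.items():
        m[p] -= n
    for p, n in produced.items():
        m[p] = m.get(p, 0) + n
    return m

def reachability(initial):
    seen, edges = {tuple(sorted(initial.items()))}, []
    queue = deque([initial])
    while queue:
        m = queue.popleft()
        for name, (consumed, produced) in transitions.items():
            if enabled(m, consumed):
                m2 = fire(m, consumed, produced)
                edges.append((m, name, m2))        # one row of the state-change table
                key = tuple(sorted(m2.items()))
                if key not in seen:
                    seen.add(key)
                    queue.append(m2)
    return edges

for src, t, dst in reachability({"P1": 1, "P2": 0, "P3": 0}):
    print(src, "--", t, "->", dst)
```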
Abstract:
This paper describes work carried out to develop methods of verifying that machine tools are capable of machining parts to within specification, immediately before carrying out critical material removal operations, and with negligible impact on process times. A review of machine tool calibration and verification technologies identified that current techniques were not suitable due to their requirements for significant time and skilled human intervention. A 'solution toolkit' is presented, consisting of a selection of circular tests and artefact probing which are able to rapidly verify the kinematic errors, and in some cases also dynamic errors, for different types of machine tool, as well as supplementary methods for tool and spindle error detection. A novel artefact probing process is introduced which simplifies data processing so that the process can be readily automated using only the native machine tool controller. Laboratory testing and industrial case studies are described which demonstrate the effectiveness of this approach.
Abstract:
Small errors can prove catastrophic. Our purpose is to remark that a very small cause which escapes our notice can determine a considerable effect that we cannot fail to see, and we then say that the effect is due to chance. Small differences in the initial conditions produce very great ones in the final phenomena: a small error in the former will produce an enormous error in the latter. When dealing with any kind of electrical device specification, it is important to note that there exists a pair of test conditions that define a test: the forcing function and the limit. Forcing functions define the external operating constraints placed upon the device under test, and the actual test defines how well the device responds to these constraints. Forcing inputs to threshold, for example, represents the most difficult testing because it puts those inputs as close as possible to the actual switching critical points and guarantees that the device will meet the input-output specifications. Prediction becomes impossible by classical analytical analysis bounded by Newton and Euclid. We have found that nonlinear dynamic behaviour is the natural state of all circuits and devices, and that opportunities exist for effective error detection in a nonlinear dynamics and chaos environment. Nowadays a set of linear limits is established around every aspect of digital or analog circuits, outside of which devices are considered bad after failing the test. Deterministic chaos in circuits is a fact, not a possibility, as this Ph.D. research has shown. In practice, for standard linear informational methodologies, this chaotic data product is usually undesirable and we are educated to prefer a more regular stream of output data. This Ph.D. research explored the possibility of taking the foundation of a very well known simulation and modeling methodology and introducing nonlinear dynamics and chaos precepts to produce a new error-detector instrument able to put together streams of data scattered in space and time, thereby mastering deterministic chaos and changing the bad reputation of chaotic data as a potential risk for practical system status determination.
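The claim that a small error in the initial conditions produces an enormous error in the outcome is easy to demonstrate numerically. The sketch below (an illustration only, unrelated to the dissertation's instrument) iterates the logistic map in its chaotic regime from two starting points that differ by one part in a billion; within a few dozen steps the trajectories differ at order one.

```python
# Sensitivity to initial conditions in the logistic map x -> r*x*(1-x), chaotic regime.
r = 3.9
x, y = 0.400000000, 0.400000001   # initial conditions differing by 1e-9

for n in range(1, 61):
    x, y = r * x * (1 - x), r * y * (1 - y)
    if n % 10 == 0:
        print(f"step {n:2d}: |x - y| = {abs(x - y):.3e}")
```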
Abstract:
Successful implementation of fault-tolerant quantum computation on a system of qubits places severe demands on the hardware used to control the many-qubit state. It is known that an accuracy threshold Pa exists for any quantum gate that is to be used for such a computation to be able to continue for an unlimited number of steps. Specifically, the error probability Pe for such a gate must fall below the accuracy threshold: Pe < Pa. Estimates of Pa vary widely, though Pa ∼ 10^-4 has emerged as a challenging target for hardware designers. I present a theoretical framework based on neighboring optimal control that takes as input a good quantum gate and returns a new gate with better performance. I illustrate this approach by applying it to a universal set of quantum gates produced using non-adiabatic rapid passage. Performance improvements are substantial compared with the original (unimproved) gates, both for ideal and non-ideal controls. Under suitable conditions detailed below, all gate error probabilities fall by 1 to 4 orders of magnitude below the target threshold of 10^-4. After applying the neighboring optimal control theory to improve the performance of quantum gates in a universal set, I further apply the general control theory in a two-step procedure for fault-tolerant logical state preparation, and I illustrate this procedure by preparing a logical Bell state fault-tolerantly. The two-step preparation procedure is as follows. Step 1 provides a one-shot procedure using neighboring optimal control theory to prepare a physical qubit state which is a high-fidelity approximation to the Bell state |β01⟩ = (1/√2)(|01⟩ + |10⟩). I show that for ideal (non-ideal) control, an approximate |β01⟩ state can be prepared with error probability ϵ ∼ 10^-6 (10^-5) with one-shot local operations. Step 2 then takes a block of p pairs of physical qubits, each prepared in the |β01⟩ state using Step 1, and fault-tolerantly prepares the logical Bell state for the C4 quantum error detection code.
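For readers who want to see the error-probability bookkeeping concretely, the snippet below (an illustration under an arbitrary perturbation, not the controls studied in the thesis) builds the Bell state |β01⟩ with numpy, perturbs it slightly, and checks whether the resulting error probability 1 - F falls below the 10^-4 threshold.

```python
# Fidelity / error-probability check for an approximately prepared Bell state.
import numpy as np

ket01 = np.kron([1, 0], [0, 1])          # |0⟩ ⊗ |1⟩
ket10 = np.kron([0, 1], [1, 0])          # |1⟩ ⊗ |0⟩
bell = (ket01 + ket10) / np.sqrt(2)      # |β01⟩

# An "approximately prepared" state: small, arbitrary admixture of |00⟩, renormalized.
approx = bell + 1e-3 * np.kron([1, 0], [1, 0])
approx = approx / np.linalg.norm(approx)

fidelity = abs(np.dot(bell.conj(), approx)) ** 2
error_probability = 1 - fidelity
print(f"error probability ≈ {error_probability:.2e},",
      "below" if error_probability < 1e-4 else "above", "the 1e-4 threshold")
```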
Abstract:
The elderly constitute a growing world population group, with more than 200 million people over 60 years of age. This fact has increased the detection of chronic-degenerative diseases, as well as the prescription and consumption of medicines. The elderly are particularly susceptible to adverse drug events or interactions with other drugs due to their physiological changes, genetic predisposition and environmental exposure. It becomes necessary to adapt health systems with integral and multidisciplinary approaches suited to this demographic change, as knowledge about appropriate prescription, clinical pharmacology and medication use in the elderly has become essential. It has been shown that about two thirds of elderly patients receive inappropriate drug doses, and a substantial percentage of their hospital admissions are associated with potentially preventable toxic effects of drugs. To date, expert criteria, error detection tools and educational prescription plans have been developed by expert consensus for the safe use of drugs in the geriatric population. The objective of this study is to briefly review the principal physiological changes in the older adult and to summarize the contributions of these consensuses on prescription.
Abstract:
In recent years, we have witnessed great changes in the industrial environment as a result of the innovations introduced by Industry 4.0, especially in the integration of the Internet of Things, Automation and Robotics in the manufacturing field. The project presented in this thesis lies within this innovation context and describes the implementation of an Image Recognition application focused on the automotive field. The project aims at helping the supply chain operator to perform an effective and efficient check of the homologation tags present on vehicles. The user's contribution consists in taking a picture of the tag, and the application, exploiting Amazon Web Services, automatically returns the result of the check on the correctness of the tag, its correct positioning within the vehicle, and the presence of faults or defects on the tag. To implement this application we combined two IoT platforms widely used in the industrial field: Amazon Web Services (AWS) and ThingWorx. AWS exploits Convolutional Neural Networks to perform Text Detection and Image Recognition, while PTC ThingWorx manages the user interface and the data manipulation.
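The text-detection step on AWS can be sketched with Amazon Rekognition, as below. This is a rough approximation of the flow described, not the thesis implementation: the image file name and the expected homologation code are hypothetical, and the ThingWorx front end and the positioning/defect checks are omitted.

```python
# Send the photographed tag to Amazon Rekognition and compare detected lines
# against an expected (hypothetical) homologation code.
import boto3

EXPECTED_TAG = "E9*2007/46*0123*00"      # hypothetical homologation code

rekognition = boto3.client("rekognition", region_name="eu-west-1")
with open("tag_photo.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

lines = [d["DetectedText"] for d in response["TextDetections"] if d["Type"] == "LINE"]
print("detected:", lines)
print("tag OK" if any(EXPECTED_TAG in line for line in lines) else "tag mismatch")
```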
Abstract:
A rapid and reliable polymerase chain reaction (PCR)-based protocol was developed for detecting zygosity of the 1BL/1RS translocation in hexaploid wheat. The protocol involved a multiplex PCR with 2 pairs of oligonucleotide primers, rye-specific Ris-1 primers, and consensus 5S intergenic spacer (IGS) primers, and digestion of the PCR products with the restriction enzyme, MseI. A small piece of alkali-treated intact leaf tissue is used as a template for the PCR, thereby eliminating the necessity for DNA extraction. The test is simple, highly sensitive, and rapid compared with the other detection systems of 1BS1RS heterozygotes in hexaploid wheat. PCR results were confirmed with AFLP analyses. Diagnostic tests for 1BL/1RS translocation based on Sec-1-specific ELISA, screening for chromosome arm 1RS controlled rust resistance locus Yr9, and the PCR test differed in their ability to detect heterozygotes. The PCR test and rust test detected more heterozygotes than the ELISA test. The PCR test is being used to facilitate S1 family recurrent selection in the Germplasm Enhancement Program of the Australian Northern Wheat Improvement Program. A combination of the PCR zygosity test with other markers currently being implemented in the breeding program makes this test economical for 1BL/1RS characterisation of S1 families.
Abstract:
In this paper we present an amorphous silicon device that can be used in two operation modes to measure the concentration of ions in solution. While crystalline devices present a higher sensitivity, their amorphous counterparts have a much lower fabrication cost, thus enabling the production of cheap disposable sensors for use, for example, in the food industry. The devices were fabricated on glass substrates by the PECVD technique in the top-gate configuration, where the metallic gate is replaced by an electrolytic solution with an immersed Ag/AgCl reference electrode. Silicon nitride is used as the gate dielectric, enhancing the sensitivity, and as a passivation layer to avoid leakage and electrochemical reactions. In this article we report on the semiconductor unit, showing that the device can be operated in a light-assisted mode, where changes in the pH produce changes in the measured ac photocurrent. Alternatively, the device can be operated as a conventional ion-selective field-effect device, where changes in the pH induce changes in the transistor's threshold voltage.
Abstract:
A multiresidue approach using microwave-assisted extraction and liquid chromatography with photodiode array detection was investigated for the determination of butylate, carbaryl, carbofuran, chlorpropham, ethiofencarb, linuron, metobromuron, and monolinuron in soils. The critical parameters of the developed methodology were studied. Method validation was performed by analyzing freshly spiked and aged spiked soil samples. The recoveries and relative standard deviations obtained under the optimized conditions were between 77.0 ± 0.46% and 120 ± 2.9%, except for ethiofencarb (46.4 ± 4.4% to 105 ± 1.6%) and butylate (22.1 ± 7.6% to 49.2 ± 11%). Soil samples from five locations in Portugal were analysed.
Abstract:
An electrochemical method is proposed for the determination of maltol in food. Microwave-assisted extraction procedures were developed to assist the sample pre-treatment steps. Experiments carried out by cyclic voltammetry showed an irreversible, adsorption-controlled reduction of maltol. A cathodic peak was observed at -1.0 V for a Hanging Mercury Drop Electrode versus an AgCl/Ag electrode (in saturated KCl), and the peak potential was pH independent. Square wave voltammetric procedures were selected to plot calibration curves. These procedures were carried out under the optimum conditions: pH 6.5; frequency 50 Hz; deposition potential 0.6 V; and deposition time 10 s. A linear behaviour was observed between 5.0 × 10^-8 and 3.5 × 10^-7 M. The proposed method was applied to the analysis of cakes, and the results were compared with those obtained by an independent method. The voltammetric procedure proved suitable for the analysis of cakes and provided environmental and economic advantages, including reduced toxicity and volume of effluents and decreased consumption of reagents.
Abstract:
Recent studies of mobile Web trends show a continuous explosion of mobile-friendly content. However, the increasing number and heterogeneity of mobile devices poses several challenges for Web programmers who want to automatically determine the delivery context and adapt the content to mobile devices. In this process, the device detection phase plays an important role, where an inaccurate detection could result in a poor mobile experience for the end user. In this paper we compare the most promising approaches for mobile device detection. Based on this study, we present an architecture for a system to detect and deliver uniform m-Learning content to students in a Higher School. We focus mainly on the device capabilities repository, manageable and accessible through an API. We detail the structure of the capabilities XML Schema that formalizes the data within the device capabilities XML repository, and the REST Web Service API for selecting the corresponding device capabilities data according to a specific request. Finally, we validate our approach by presenting access and usage statistics of the mobile web interface of the proposed system, such as hits and new visitors, mobile platforms, average time on site and bounce rate.
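A minimal sketch of the detection-and-lookup flow, under assumed names: the client's User-Agent is forwarded to a hypothetical capabilities REST endpoint, and the returned JSON fields (placeholders for whatever the capabilities XML Schema actually defines) drive the choice of content layout.

```python
# Query a (hypothetical) device-capabilities REST API and pick a content layout.
import requests

def adapt_content(user_agent: str) -> str:
    resp = requests.get(
        "https://mlearning.example.edu/api/capabilities",   # hypothetical endpoint
        params={"ua": user_agent},
        timeout=5,
    )
    caps = resp.json()
    # Choose a rendering profile from the returned capabilities (assumed fields).
    if caps.get("is_mobile") and caps.get("screen_width", 1024) < 480:
        return "smartphone-layout"
    if caps.get("is_mobile"):
        return "tablet-layout"
    return "desktop-layout"

print(adapt_content("Mozilla/5.0 (Linux; Android 13; Pixel 7) AppleWebKit/537.36"))
```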
Abstract:
This research aims to advance blink detection in the context of work activity. Rather than patients having to attend a clinic, blinking videos can be acquired in a work environment and then automatically analyzed. Therefore, this paper presents a methodology to perform the automatic detection of eye blinks using consumer videos acquired with low-cost web cameras. This methodology includes the detection of the face and eyes of the recorded person; it then analyzes the low-level features of the eye region to create a quantitative vector. Finally, this vector is classified into one of the two categories considered (open or closed eyes) by using machine learning algorithms. The effectiveness of the proposed methodology was demonstrated, since it provides unbiased results with classification errors under 5%.
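The face-and-eye localization front end can be sketched with OpenCV's stock Haar cascades, as below. This is only an assumed stand-in for the paper's pipeline: it crops the eye regions from a single webcam frame, which is where the low-level feature extraction and the open/closed classifier would plug in.

```python
# Face/eye localization on one webcam frame using OpenCV Haar cascades.
# The feature extraction and open/closed classification of the paper are not shown.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

cap = cv2.VideoCapture(0)                 # low-cost web camera
ok, frame = cap.read()
cap.release()

if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
        face_roi = gray[y:y + h, x:x + w]
        for (ex, ey, ew, eh) in eye_cascade.detectMultiScale(face_roi):
            eye_patch = face_roi[ey:ey + eh, ex:ex + ew]
            # eye_patch would be reduced to a low-level feature vector and
            # classified as open or closed by the trained model.
            print("eye region of size", eye_patch.shape)
```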
Abstract:
In the past, research in ontology learning from text has mainly focused on entity recognition, taxonomy induction and relation extraction. In this work we approach a challenging research issue: detecting semantic frames from texts and using them to encode web ontologies. We exploit a new generation Natural Language Processing technology for frame detection, and we enrich the frames acquired so far with argument restrictions provided by a super-sense tagger and domain specializations. The results are encoded according to a Linguistic MetaModel, which allows a complete translation of lexical resources and data acquired from text, enabling custom transformations of the enriched frames into modular ontology components.