897 results for fault correction
Abstract:
While robots are gradually becoming part of our daily lives, they already play vital roles in many critical operations. Some of these critical tasks include surgeries, battlefield operations, and tasks that take place in hazardous environments or distant locations such as space missions. In most of these tasks, remotely controlled robots are used instead of autonomous robots. This special area of robotics is called teleoperation. Teleoperation systems must be reliable when used in critical tasks; hence, all of the subsystems must be dependable even under a subsystem or communication-line failure. These systems are categorized as unilateral or bilateral teleoperation. A special type of bilateral teleoperation is force-reflecting teleoperation, which is further investigated as limited- and unlimited-workspace teleoperation. The teleoperation systems configured in this study are tested in both numerical simulations and experiments. A new method, Virtual Rapid Robot Prototyping, is introduced to create system models rapidly and accurately. This method is then extended to configure experimental setups in which actual master systems work with system models of the slave robots accompanied by virtual-reality screens, as well as with the actual slaves. Fault-tolerant design and modeling of the master and slave systems are also addressed at different levels to prevent subsystem failure. Teleoperation controllers are designed to compensate for instabilities due to communication time delays. Modifications to the existing controllers are proposed to configure a controller that remains reliable under communication-line failures. Position/force controllers are also introduced for the master and/or slave robots. Controller architecture changes are then discussed to make these controllers dependable even in systems experiencing communication problems. The customary and proposed controllers for teleoperation systems are tested in numerical simulations on single- and multi-DOF teleoperation systems. Experimental studies are then conducted on seven different systems, covering limited- and unlimited-workspace teleoperation, to verify and improve the simulation studies. Experiments with the proposed controllers were successful relative to the customary controllers. Overall, by employing the fault-tolerance features and the proposed controllers, it is possible to design and configure a more reliable teleoperation system, which allows these systems to be used in a wider range of critical missions.
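The dissertation's specific delay-compensating controllers are not reproduced in the abstract; as a point of reference only, the sketch below implements the standard passivity-based wave-variable (scattering) channel for a single-DOF master-slave pair under a constant one-way delay. The gains, wave impedance b, environment stiffness, and operator input are illustrative assumptions, not values from the study.

```python
# Minimal sketch (not the dissertation's controller): a passivity-based
# wave-variable channel, a standard remedy for delay-induced instability
# in bilateral teleoperation. All parameter values are illustrative.
import numpy as np

dt, T_end, delay = 1e-3, 10.0, 0.2          # step, horizon, one-way delay [s]
n_delay = int(delay / dt)
b = 5.0                                      # wave impedance (tuning parameter)
k_env = 50.0                                 # stiffness of the slave's environment
m_s, k_v = 1.0, 20.0                         # slave mass, velocity-tracking gain

u_buf = np.zeros(n_delay)                    # forward wave in transit (master -> slave)
v_buf = np.zeros(n_delay)                    # backward wave in transit (slave -> master)
x_s, xd_s = 0.0, 0.0                         # slave position, velocity
log_Fm = []

for k in range(int(T_end / dt)):
    t = k * dt
    xd_m = 0.1 * np.sin(2 * np.pi * 0.5 * t)         # operator-imposed master velocity

    v_m = v_buf[k % n_delay]                         # backward wave arriving at master
    u_m = np.sqrt(2 * b) * xd_m - v_m                # forward wave sent to slave
    F_m = b * xd_m - np.sqrt(2 * b) * v_m            # force reflected to the operator

    u_s = u_buf[k % n_delay]                         # forward wave arriving at slave
    F_s = k_env * x_s                                # environment contact force
    xd_cmd = (np.sqrt(2 * b) * u_s - F_s) / b        # decoded velocity command
    v_s = u_s - np.sqrt(2.0 / b) * F_s               # backward wave sent to master

    xdd_s = (k_v * (xd_cmd - xd_s) - F_s) / m_s      # slave velocity-tracking control
    xd_s += xdd_s * dt
    x_s += xd_s * dt

    u_buf[k % n_delay], v_buf[k % n_delay] = u_m, v_s  # circular buffers = constant delay lines
    log_Fm.append(F_m)

print("peak reflected force:", max(abs(f) for f in log_Fm))
```

The encoding keeps the communication channel passive for any constant delay, which is why this family of controllers is a common baseline for the delay problem the abstract describes.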
Abstract:
Respiratory gating in lung PET imaging to compensate for respiratory motion artifacts is a current research issue with broad potential impact on the quantitation, diagnosis, and clinical management of lung tumors. However, PET images collected in discrete gated bins can be significantly affected by noise, since each bin contains lower activity counts unless the total PET acquisition time is prolonged; gating methods should therefore be combined with image-based motion correction and registration methods. The aim of this study was to develop and validate a fast and practical solution to the problem of respiratory motion for the detection and accurate quantitation of lung tumors in PET images. This included: (1) developing a computer-assisted algorithm for PET/CT images that automatically segments lung regions in CT images and identifies and localizes lung tumors in PET images; (2) developing and comparing different registration algorithms that process all the information within the entire respiratory cycle and integrate the tumor data from the different gated bins into a single reference bin. Four registration/integration algorithms (Centroid-Based, Intensity-Based, Rigid-Body, and Optical-Flow registration) were compared, as well as two registration schemes (Direct Scheme and Successive Scheme). Validation was demonstrated by conducting experiments with the computerized 4D NCAT phantom and with a dynamic lung-chest phantom imaged using a GE PET/CT system. Experiments were repeated on simulated tumors of different sizes and at different noise levels. Static tumors without respiratory motion were used as the gold standard; quantitative results were compared with respect to tumor activity concentration, cross-correlation coefficient, relative noise level, and computation time. Comparing the tumors before and after correction, the corrected tumor activity values and tumor volumes were closer to those of the static tumors (gold standard). Higher correlation values and lower noise were also achieved after applying the correction algorithms. With this method, the compromise between short PET scan time and reduced image noise can be achieved, while quantification and clinical analysis become fast and precise.
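As an illustration of the simplest of the four algorithms, the hedged sketch below aligns each gated bin to a reference bin by translating it so the tumor centroids coincide, then sums the aligned bins (a Direct-Scheme-style integration). The thresholding rule, array sizes, and the toy drifting-Gaussian "tumor" are assumptions for demonstration only, not the paper's settings.

```python
# Hedged sketch of the centroid-based registration/integration idea:
# translate each gated PET bin so its tumor centroid matches the reference
# bin, then sum the aligned bins into a single low-noise image.
import numpy as np
from scipy import ndimage

def tumor_centroid(img, frac=0.5):
    """Centroid of voxels above a fraction of the image maximum (crude tumor mask)."""
    mask = img >= frac * img.max()
    return np.array(ndimage.center_of_mass(mask))

def centroid_register(bins, ref_index=0):
    """Translate every gated bin so its tumor centroid matches the reference bin."""
    ref_c = tumor_centroid(bins[ref_index])
    aligned = []
    for b in bins:
        shift = ref_c - tumor_centroid(b)              # translation in voxels
        aligned.append(ndimage.shift(b, shift, order=1))
    return np.sum(aligned, axis=0)                     # integrated reference-bin image

# Toy example: 6 gated bins of a Gaussian "tumor" drifting along z.
zz, yy, xx = np.meshgrid(*[np.arange(48)] * 3, indexing="ij")
bins = [np.exp(-((zz - 20 - 2 * k) ** 2 + (yy - 24) ** 2 + (xx - 24) ** 2) / 30.0)
        for k in range(6)]
integrated = centroid_register(bins)
print("peak before/after integration:", bins[0].max(), integrated.max() / len(bins))
```

The intensity-based, rigid-body, and optical-flow variants mentioned in the abstract would replace the centroid shift with richer transformation estimates, at higher computation cost.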
Abstract:
This research explores Bayesian updating as a tool for estimating parameters probabilistically by dynamic analysis of data sequences. Two distinct Bayesian updating methodologies are assessed. The first approach focuses on Bayesian updating of failure rates for primary events in fault trees. A Poisson Exponentially Weighted Moving Average (PEWMA) model is implemented to carry out Bayesian updating of failure rates for individual primary events in the fault tree. To provide a basis for testing the PEWMA model, a fault tree is developed based on the Texas City Refinery incident, which occurred in 2005. A qualitative fault tree analysis is then carried out to obtain a logical expression for the top event. A dynamic fault tree analysis is carried out by evaluating the top-event probability at each Bayesian updating step via Monte Carlo sampling from the posterior failure rate distributions. It is demonstrated that PEWMA modeling is advantageous over conventional conjugate Poisson-Gamma updating techniques when failure data are collected over long time spans. The second approach focuses on Bayesian updating of parameters in non-linear forward models. Specifically, the technique is applied to the hydrocarbon material balance equation. In order to test the accuracy of the implemented Bayesian updating models, a synthetic data set is developed using the Eclipse reservoir simulator. Both structured-grid and MCMC-sampling-based solution techniques are implemented and are shown to model the synthetic data set with good accuracy. Furthermore, a graphical analysis shows that the implemented MCMC model displays good convergence properties. A case study demonstrates that the likelihood variance affects the rate at which the posterior assimilates information from the measured data sequence. Error in the measured data significantly affects the accuracy of the posterior parameter distributions. Increasing the likelihood variance mitigates random measurement errors, but causes the overall variance of the posterior to increase. Bayesian updating is shown to be advantageous over deterministic regression techniques because it allows for the incorporation of prior belief and full modeling of uncertainty over the parameter ranges. As such, the Bayesian approach to estimating parameters in the material balance equation shows utility for incorporation into reservoir engineering workflows.
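For concreteness, the sketch below shows the conventional conjugate Poisson-Gamma updating baseline that the PEWMA model is compared against, followed by Monte Carlo evaluation of a top-event probability from the sampled posterior failure rates. The two-branch fault-tree logic, priors, and observation data are hypothetical and do not represent the Texas City Refinery model.

```python
# Hedged sketch: conventional conjugate Poisson-Gamma updating of primary-event
# failure rates, followed by Monte Carlo evaluation of a top-event probability.
# The fault-tree logic and all numbers are illustrative, not the thesis's model.
import numpy as np

rng = np.random.default_rng(0)

def gamma_update(alpha, beta, n_failures, exposure_time):
    """Gamma(alpha, beta) prior on a Poisson rate -> posterior after new evidence."""
    return alpha + n_failures, beta + exposure_time

# Priors (shape, rate) for three hypothetical primary events A, B, C.
params = {"A": (1.0, 100.0), "B": (2.0, 500.0), "C": (1.0, 50.0)}

# Sequential updating with yearly (failures, hours observed) records.
evidence = {"A": [(0, 8760.0), (1, 8760.0)],
            "B": [(0, 8760.0), (0, 8760.0)],
            "C": [(2, 8760.0), (1, 8760.0)]}
for event, obs in evidence.items():
    for n, t in obs:
        params[event] = gamma_update(*params[event], n, t)

# Illustrative top-event logic: TOP = A OR (B AND C), over a mission time.
mission_hours, n_samples = 720.0, 100_000
p = {e: 1.0 - np.exp(-rng.gamma(a, 1.0 / b, n_samples) * mission_hours)
     for e, (a, b) in params.items()}            # rate samples -> event probabilities
p_top = p["A"] + p["B"] * p["C"] - p["A"] * p["B"] * p["C"]
print(f"posterior mean top-event probability: {p_top.mean():.4f}")
```

The PEWMA approach described in the abstract replaces the static Gamma posterior with an exponentially discounted one, so that older failure counts carry less weight as the data sequence grows.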
Abstract:
Rapid development in industry has contributed to more complex systems that are prone to failure. In applications where the presence of faults may lead to premature failure, fault detection and diagnosis (FDD) tools are often implemented. The goal of this research is to improve the diagnostic ability of existing FDD methods. Kernel Principal Component Analysis (KPCA) has good fault detection capability; however, it can only detect the fault and identify a few variables that contribute to its occurrence, and is thus not precise in diagnosis. Hence, KPCA was used to detect abnormal events, and the most contributing variables were extracted for further analysis in the diagnosis phase. The diagnosis phase was carried out in both a qualitative and a quantitative manner. In the qualitative mode, a network-based causality analysis method was developed to show the causal relationships among the most contributing variables in the occurrence of the fault. To obtain a more quantitative diagnosis, a Bayesian network was constructed to analyze the problem from a probabilistic perspective.
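A generic version of the detection stage might look like the hedged sketch below: kernel PCA is fitted on normal operating data, a T²-like statistic on the retained kernel scores is monitored, and samples exceeding an empirical control limit are flagged. The monitoring statistic, kernel settings, and synthetic data are illustrative assumptions, not necessarily those used in the thesis.

```python
# Hedged sketch of generic KPCA-based fault detection (detection stage only;
# the contribution analysis and Bayesian-network diagnosis are not shown).
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)

# Synthetic "process": 10 correlated variables; the faulty batch has a step
# bias injected into variables 2 and 7.
mix = rng.normal(size=(10, 10))
X_normal = rng.normal(size=(500, 10)) @ mix
X_fault = rng.normal(size=(100, 10)) @ mix
X_fault[:, [2, 7]] += 3.0

scaler = StandardScaler().fit(X_normal)
kpca = KernelPCA(n_components=5, kernel="rbf", gamma=0.1)
scores_train = kpca.fit_transform(scaler.transform(X_normal))
score_var = scores_train.var(axis=0)                      # per-component variance

def t2(X):
    """Hotelling-like statistic on the kernel principal scores."""
    scores = kpca.transform(scaler.transform(X))
    return np.sum(scores**2 / score_var, axis=1)

limit = np.quantile(t2(X_normal), 0.99)                   # empirical 99% control limit
alarm_rate = np.mean(t2(X_fault) > limit)
print(f"fault samples flagged: {alarm_rate:.0%}")
```

In the workflow the abstract describes, the variables contributing most to an alarm would then be passed to the network-based causality analysis and the Bayesian network for diagnosis.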
Abstract:
A near-bottom geological and geophysical survey was conducted at the western intersection of the Siqueiros Transform Fault and the East Pacific Rise. Transform-fault shear appears to distort the east flank of the rise crest in an area north of the fracture zone. Inward-facing scarps trend 335° and do not parallel the regional axis of spreading. Small-scale scarps reveal a hummocky bathymetry. The center of spreading is not a central peak but rather a 20-40 m deep, 1 km wide valley superimposed upon an 8 km wide ridge-crest horst. Small-scale topography indicates widespread volcanic flows within the valley. Two 0.75 km wide blocks flank the central valley. Fault scarps are more dominant on the western flank. Their alignment shifts from directions intermediate to parallel to the regional axis of spreading (355°). A median ridge within the fracture zone has a fault-block topography similar to that of the East Pacific Rise to the north. Dominant eastward-facing scarps trending 335° are on the west flank. A central depression, 1 km wide and 30 m deep, separates the dominantly fault-block regime of the west from the smoother topography of the east flank. This ridge originated by uplift due to faulting as well as by volcanism. Detailed mapping was concentrated in a perched basin (Dante's Hole) at the intersection of the rise crest and the fracture zone. Structural features suggest that Dante's Hole is an area subject to extreme shear and tensional drag resulting from the transition between non-rigid and rigid crustal behavior. Normal E-W crustal spreading is probably taking place well within the northern confines of the basin. Possible residual spreading of this isolated rise crest, coupled with shear drag within the transform fault, could explain the structural isolation of Dante's Hole from the remainder of the Siqueiros Transform Fault.
Abstract:
Topological quantum error correction codes are currently among the most promising candidates for efficiently dealing with the decoherence effects inherently present in quantum devices. Numerically, their theoretical error threshold can be calculated by mapping the underlying quantum problem to a related classical statistical-mechanical spin system with quenched disorder. Here, we present results for the general fault-tolerant regime, where we consider both qubit and measurement errors. However, unlike in previous studies, here we vary the strength of the different error sources independently. Our results highlight peculiar differences between toric and color codes. This study complements previous results published in New J. Phys. 13, 083006 (2011).
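For orientation, the standard form of this mapping in the perfect-measurement limit of the toric code (independent bit-flip noise) is sketched below; the fully fault-tolerant case studied in the paper, with independent qubit and measurement error rates, generalizes it to a higher-dimensional disordered model (a random-plaquette gauge theory for the toric code), and the color-code mappings differ in their interaction terms.

```latex
% Sketch of the standard statistical-mechanical mapping (perfect-measurement
% limit, toric code with independent bit-flip errors of probability p).
\begin{align}
  \mathcal{H} &= -J \sum_{\langle ij \rangle} \tau_{ij}\, S_i S_j ,
  \qquad S_i = \pm 1,\quad \tau_{ij} = \pm 1 , \\
  P(\tau_{ij} = -1) &= p
  \qquad \text{(quenched antiferromagnetic bonds, one per physical qubit)} , \\
  e^{-2\beta J} &= \frac{p}{1-p}
  \qquad \text{(Nishimori line: temperature tied to the error rate)} .
\end{align}
```

The error threshold is identified with the multicritical point where order is lost along the Nishimori line, roughly p ≈ 0.109 for the toric code with perfect measurements; varying the qubit and measurement error strengths independently, as in this study, moves the system off the symmetric version of this condition.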
Abstract:
This letter presents an FPGA implementation of a fault-tolerant Hopfield Neural Network (HNN). The robustness of this circuit against Single Event Upsets (SEUs) and Single Event Transients (SETs) has been evaluated. Results show the fault tolerance of the proposed design, compared to a previous non-fault-tolerant implementation and to a solution based on triple modular redundancy (TMR) of a standard HNN design.
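The FPGA design itself is not described in the abstract; the sketch below only mimics, in software, the kind of robustness evaluation reported: a Hebbian Hopfield network recalls stored patterns while random bit flips (a crude SEU model) are injected into an 8-bit copy of its weight memory. Network size, patterns, and upset counts are illustrative assumptions.

```python
# Hedged software analogue of an SEU robustness test; not the letter's circuit.
import numpy as np

rng = np.random.default_rng(2)
N, n_patterns = 64, 3
patterns = rng.choice([-1, 1], size=(n_patterns, N))

# Hebbian weights stored as 8-bit signed integers (the "memory" under test).
W = (patterns.T @ patterns).astype(np.int8)
np.fill_diagonal(W, 0)

def recall(W, probe, steps=20):
    """Synchronous Hopfield updates from a probe state."""
    s = probe.copy()
    for _ in range(steps):
        s = np.where(W.astype(float) @ s >= 0, 1, -1)
    return s

def inject_seu(W, n_flips):
    """Flip random bits in the 8-bit weight memory (crude SEU model)."""
    Wf = W.copy()
    mem = Wf.view(np.uint8)                          # raw byte view of the weights
    rows = rng.integers(0, N, n_flips)
    cols = rng.integers(0, N, n_flips)
    bits = rng.integers(0, 8, n_flips)
    mem[rows, cols] ^= (np.uint8(1) << bits.astype(np.uint8))
    return Wf

for n_flips in (0, 10, 100, 1000):
    Wf = inject_seu(W, n_flips)
    ok = sum(np.array_equal(recall(Wf, p), p) for p in patterns)
    print(f"{n_flips:5d} upsets -> {ok}/{n_patterns} patterns recalled")
```

A hardware evaluation of the kind the letter reports would additionally cover transient faults (SETs) in the datapath, which a purely software weight-memory model cannot capture.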
Abstract:
This study was developed under the ExxonMobil FC2 Alliance (Fundamental Controls on Flow in Carbonates). The authors wish to thank ExxonMobil Production Company and ExxonMobil Upstream Research Company for providing funding. The views in this article by Sherry L. Stafford are her own and not necessarily those of ExxonMobil. This research was supported by the Sedimentary Geology Research Group of the Generalitat de Catalunya (2014SGR251). We would like to thank Andrea Ceriani and Paola Ronchi for their critical and valuable reviews, and Associate Editor Piero Gianolla for the editorial work.
Abstract:
Peer reviewed
Abstract:
Major funding was provided by the UK Natural Environment Research Council (NERC) under grant NE/I028017/1 and partially supported by Boğaziçi University Research Fund (BAP) under grant 6922. We would like to thank all the project members from the University of Leeds, Boğaziçi University, Kandilli Observatory, Aberdeen University and Sakarya University. I would also like to thank Prof. Ali Pinar and Dr. Kıvanç Kekovalı for their valuable comments. Some of the figures were generated by GMT software (Wessel and Smith, 1995).
Abstract:
Peer reviewed
Abstract:
Acknowledgements: The authors would like to thank Total E&P and BG Group for project funding and support, and the Industry Technology Facilitator for facilitating the collaborative development (grant number 3322PSD). The authors would also like to express their gratitude to the Aberdeen Formation Evaluation Society and the College of Physical Sciences at the University of Aberdeen for partial financial support. Raymi Castilla (Total E&P), Fabrizio Agosta and Cathy Hollis are also thanked for their constructive comments and suggestions to improve the standard of this manuscript as are John Still and Colin Taylor (University of Aberdeen) for technical assistance in the laboratory. Piero Gianolla is thanked for his editorial handling of the manuscript.
Abstract:
Peer reviewed
Abstract:
ACKNOWLEDGMENT: We are thankful to RTE for financial support of this project.