51 results for Equipment Failure Analysis
at Universidad Politécnica de Madrid
Abstract:
Non-failure analysis aims at inferring that predicate calls in a program will never fail. This type of information has many applications in functional/logic programming: it is essential for determining lower bounds on the computational cost of calls, useful in the context of program parallelization, instrumental in partial evaluation and other program transformations, and has also been used in query optimization. In this paper, we recast the non-failure analysis proposed by Debray et al. as an abstract interpretation, which not only allows us to investigate it within a standard and well-understood theoretical framework, but also has several practical advantages. It allows us to incorporate non-failure analysis into a standard, generic abstract interpretation engine. The analysis thus benefits from the fixpoint propagation algorithm, which leads to improved information propagation. The analysis also takes advantage of the multi-variance of the generic engine, so that it is now able to infer separate non-failure information for different call patterns. Moreover, the implementation is simpler, and allows non-failure and covering analyses to be performed alongside other analyses, such as those for modes and types, in the same framework. Finally, besides the precision improvements and the additional simplicity, our implementation (in the Ciao/CiaoPP multiparadigm programming system) also shows better efficiency.
Abstract:
We provide a method whereby, given mode and (upper approximation) type information, we can detect procedures and goals that can be guaranteed not to fail (i.e., to produce at least one solution or not terminate). The technique is based on an intuitively very simple notion, that of a (set of) tests "covering" the type of a set of variables. We show that the problem of determining a covering is undecidable in general, and give decidability and complexity results for the Herbrand and linear arithmetic constraint systems. We give sound algorithms for determining covering that are precise and efficient in practice. Based on this information, we show how to identify goals and procedures that can be guaranteed not to fail at runtime. Applications of such non-failure information include programming error detection, program transformations and parallel execution optimization, avoiding speculative parallelism, and estimating lower bounds on the computational costs of goals, which can be used for granularity control. Finally, we report on an implementation of our method and show that better results are obtained than with previously proposed approaches.
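As an aside, the "covering" notion can be sketched in a few lines. This is a simplified illustration, not the paper's algorithm, and it only handles a finite Herbrand type, where coverage is decidable by enumeration: a set of tests covers a type when every value of the type satisfies at least one test, so some clause applies to every possible input. All names below are hypothetical.

```python
# Hedged sketch (not the paper's algorithm): for a *finite* Herbrand type,
# a set of tests "covers" the type if every value in the type satisfies at
# least one test, so the predicate cannot fail for lack of an applicable clause.

def covers(tests, type_values):
    """tests: list of predicates over a value; type_values: finite set of values."""
    return all(any(t(v) for t in tests) for v in type_values)

# Example: clauses p(a) and p(b) contribute the head tests X=a and X=b;
# together they cover the type {a, b} but not the type {a, b, c}.
tests = [lambda v: v == "a", lambda v: v == "b"]
print(covers(tests, {"a", "b"}))       # True  -> p cannot fail on {a, b}
print(covers(tests, {"a", "b", "c"}))  # False -> p may fail on input c
```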
Abstract:
Three broken steel bars from a stitched crack in a dam are reported. The inspection of the fracture surfaces of the prestressed bars suggests that the fractures were triggered by small cracks together with the inherent brittleness of the bars, whose fracture toughness was about 40 MPa·m^(1/2). The analysis of the failures shows that the usual design requirements for prestressing bars fail to warn against brittle failures if some damage exists. Some recommendations, based on the concept of damage tolerance, are suggested to avoid similar unfortunate incidents.
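A rough linear-elastic fracture mechanics estimate makes the point about small cracks concrete. Only the toughness value comes from the abstract; the service stress and geometry factor below are assumed purely for illustration:

```python
import math

# Hedged illustration (stress and geometry factor assumed, not from the paper):
# LEFM predicts failure when K = Y * sigma * sqrt(pi * a) reaches the
# toughness K_Ic, so the critical crack depth is
#   a_c = (1 / pi) * (K_Ic / (Y * sigma))**2

K_Ic = 40.0      # MPa*sqrt(m), the toughness reported in the abstract
sigma = 1000.0   # MPa, assumed service stress in a prestressing bar
Y = 1.12         # assumed geometry factor for a shallow surface crack

a_c = (K_Ic / (Y * sigma)) ** 2 / math.pi
print(f"critical crack depth ~ {a_c * 1e3:.1f} mm")  # a fraction of a millimetre
```

With these assumed numbers the critical crack depth is well under a millimetre, which is consistent with the abstract's observation that small cracks sufficed to trigger brittle fracture.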
Abstract:
In this work, the failure analysis carried out on III-V concentrator multijunction solar cells after a temperature accelerated life test is presented. All the failures that appeared have been catastrophic, since all the solar cells turned into low shunt resistances. A case study in failure analysis based on characterization by optical microscopy, SEM, EDX, EQE and XPS is presented, revealing metal deterioration in the bus bar and fingers as well as cracks in the semiconductor structure beneath or next to the bus bar. In fact, in regions far from the bus bar the semiconductor structure seems not to be damaged. SEM images have ruled out the presence of metal spikes inside the solar cell structure. Therefore, we think that for these particular solar cells, failures appear mainly as a consequence of deficient electrolytic growth of the front metallization, which also results in failures in the semiconductor structure close to the bus bars.
Abstract:
GRC is a cementitious composite material made up of a cement mortar matrix and chopped glass fibers. Due to its outstanding mechanical properties, GRC has been widely used to produce cladding panels and some civil engineering elements. Impact failure of cladding panels made of GRC may occur during production if a tool falls onto the panel, due to stones or other objects impacting at low velocities, or be caused by debris projected after a blast. Impact failure of a front panel of a building may not only have an important economic cost, but human lives may also be at risk if broken pieces of the panel fall from the building to the pavement. Therefore, knowing GRC impact strength is necessary to avoid economic losses and putting human lives at risk. A one-stage light-gas gun is an impact test machine capable of testing different materials subjected to impact loads. An experimental program was carried out, testing GRC samples of five different formulations commonly used in the building industry. Steel spheres were shot at different velocities at square GRC samples. The residual velocity of the projectiles was obtained both using a high-speed camera with multiframe exposure and by measuring the projectile's penetration depth in molding clay blocks. Tests were performed on young and artificially aged GRC samples to compare GRC's behavior when subjected to high strain rates. Numerical simulations using a hydrocode were performed to analyze which parameters are most important during an impact event. GRC impact strength was obtained from the test results. The embrittlement caused by GRC aging was found to have no influence on its impact behavior, due to the small size of the projectile. Likewise, the glass fibers used in GRC production only maintain the panels' integrity and have no influence on GRC's impact strength. The numerical models reproduced the impact tests accurately.
Abstract:
A temperature accelerated life test on concentrator lattice-mismatched Ga0.37In0.63P/Ga0.83In0.17As/Ge triple-junction solar cells-on-carrier is being carried out. The solar cells have been tested at three different temperatures (125, 145 and 165°C), and the nominal photo-current condition (500X) is emulated by injecting current in darkness. The final objective of these tests is to evaluate the reliability, warranty period and failure mechanisms of these solar cells in a moderate period of time. Up to now, only the test at 165°C has finished. Therefore, we cannot yet provide complete reliability information, but we have carried out preliminary data and failure analyses with the current results.
Abstract:
In this work, the robustness and stability of continuum damage models applied to material failure in soft tissues are addressed. In implicit damage models equipped with softening, the presence of negative eigenvalues in the elemental tangent matrix degrades the condition number of the global matrix, leading to a reduction of the computational performance of the numerical model. Two strategies have been adapted from the literature to mitigate this degradation: the IMPL-EX integration scheme [Oliver, 2006], which renders the elemental matrix contribution positive definite, and arclength-type continuation methods [Carrera, 1994], which allow capturing the unstable softening branch in brittle ruptures. The major drawback of the IMPL-EX integration scheme is the need to use small time steps to keep the numerical error below an acceptable value. A convergence study, limiting the maximum allowed increment of the internal variables of the damage model, is presented. Finally, numerical simulation of failure problems with fibre-reinforced materials illustrates the performance of the adopted methodology.
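The IMPL-EX idea can be sketched for a scalar damage model. This is a minimal illustration assuming an exponential softening law and equal step sizes, not the paper's implementation: the internal variable is extrapolated from the two previous implicitly corrected values, so the stress update within a step is linear in the strain and the algorithmic tangent remains positive.

```python
import numpy as np

# Hedged 1D sketch of the IMPL-EX scheme [Oliver, 2006], not the paper's code.

def damage(r, r0=1.0):
    """Exponential softening damage law d(r) in [0, 1); d = 0 below threshold r0."""
    return 0.0 if r <= r0 else 1.0 - (r0 / r) * np.exp(-(r - r0))

def implex_step(eps_new, r_prev, r_prev2, E=1.0):
    # 1. explicit extrapolation of the internal variable (equal step sizes
    #    assumed); the damage is frozen, so sigma is linear in eps_new
    r_tilde = r_prev + (r_prev - r_prev2)
    sigma = (1.0 - damage(r_tilde)) * E * eps_new
    # 2. implicit correction, stored for the next step's extrapolation
    r_new = max(r_prev, abs(eps_new))
    return sigma, r_new

# drive a strain ramp; small steps keep the extrapolation error small
r_prev = r_prev2 = 1.0
for eps in np.linspace(0.0, 2.0, 201):
    sigma, r_new = implex_step(eps, r_prev, r_prev2)
    r_prev2, r_prev = r_prev, r_new
```

The positive-definiteness claim is visible in the sketch: since the damage entering the stress is computed from already-known history, the step tangent is simply (1 - d)·E > 0, at the cost of an extrapolation error that shrinks with the step size.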
Abstract:
This paper proposes a method for identifying different partial discharge (PD) sources through the analysis of a collection of PD signals acquired with a PD measurement system. This method, robust and sensitive enough to cope with noisy data and external interferences, combines a characterization of each signal in the collection with a clustering procedure, the CLARA algorithm. Several features are proposed for the characterization of the signals, the wavelet variances, the frequency estimated with the Prony method, and the energy being the most relevant for the performance of the clustering procedure. The result of the unsupervised classification is a set of clusters, each containing those signals which are more similar to each other than to those in other clusters. The analysis of the classification results permits both the identification of different PD sources and the discrimination between original PD signals, reflections, noise and external interferences. The methods and graphical tools detailed in this paper have been coded and published as a contributed package for the R environment under a GNU/GPL license.
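A minimal sketch of this pipeline, with hypothetical stand-ins rather than the published R package: the dominant FFT frequency replaces the Prony estimate and wavelet variances, and a naive k-medoids loop replaces CLARA (which essentially runs PAM on subsamples).

```python
import numpy as np

# Hedged sketch: extract a small feature vector per pulse, then cluster.

def features(signal, fs):
    """Energy and dominant FFT frequency of one pulse (stand-in features)."""
    energy = float(np.sum(signal ** 2))
    spectrum = np.abs(np.fft.rfft(signal))
    freq = np.fft.rfftfreq(len(signal), d=1.0 / fs)[int(np.argmax(spectrum))]
    return np.array([energy, freq])

def k_medoids(X, k, iters=10):
    # deterministic, evenly spread initial medoids keep the sketch simple
    medoids = np.linspace(0, len(X) - 1, k).astype(int)
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        dist = np.linalg.norm(X[:, None] - X[medoids][None], axis=2)
        labels = dist.argmin(axis=1)
        # new medoid of each cluster: member minimizing total in-cluster distance
        medoids = np.array([
            np.flatnonzero(labels == j)[int(
                np.linalg.norm(X[labels == j][:, None] - X[labels == j][None],
                               axis=2).sum(axis=1).argmin())]
            for j in range(k)])
    return labels

# two synthetic PD-like "sources": damped sinusoids at different frequencies
fs = 1e6
t = np.arange(256) / fs
freqs = [5e4] * 5 + [2e5] * 5
sigs = [(1 + 0.05 * i) * np.exp(-2e4 * t) * np.sin(2 * np.pi * f * t)
        for i, f in enumerate(freqs)]
X = np.array([features(s, fs) for s in sigs])
X = (X - X.mean(axis=0)) / X.std(axis=0)   # z-score each feature
labels = k_medoids(X, k=2)                 # pulses regroup by source frequency
```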
Abstract:
The road to the automation of agricultural processes passes through the safe operation of autonomous vehicles. This requirement is already met for ground mobile units, but it is still not well defined for aerial robots (UAVs), mainly because the applicable standards and legislation are quite diffuse or even nonexistent. Defining a common, global policy is therefore the challenge to tackle, and this characterization has to be addressed from field experience. Accordingly, this paper presents the work done in this direction, based on an analysis of the most common sources of hazards when using UAVs for agricultural tasks. The work, based on the ISO 31000 standard, has been carried out by applying a three-step structure that integrates identification, assessment and reduction procedures. The paper describes how this method has been applied to analyze previous accidents and malfunctions during UAV operations in order to obtain real failure causes. This has made it possible to highlight common risks and hazard sources and to propose specific guards and safety measures for the agricultural context.
Abstract:
Steam Generator Tube Rupture (SGTR) sequences in Pressurized Water Reactors are known to be among the most demanding transients for the operating crew. SGTRs are a special kind of transient, as they can lead to radiological releases without core damage or containment failure, since they constitute a direct path from the reactor coolant system to the environment. The first methodology used to perform the Deterministic Safety Analysis (DSA) of an SGTR did not credit operator action for the first 30 min of the transient, assuming that the operating crew was able to stop the primary-to-secondary leakage within that period of time. However, the real SGTR accidents that have happened in the USA and elsewhere demonstrated that operators usually take more than 30 min to stop the leakage in actual sequences. Several methodologies were proposed to overcome this, considering operator actions from the beginning of the transient, as is done in Probabilistic Safety Analysis. This paper presents the results of comparing different assumptions regarding the single-failure criterion and the operator actions taken from the most common methodologies included in the different Deterministic Safety Analyses. A single-failure criterion that has not previously been analysed in the literature is also proposed and analysed here. The comparison is done with a PWR Westinghouse three-loop model of Almaraz NPP in the TRACE code, with best-estimate assumptions but including deterministic hypotheses such as the single-failure criterion or loss of offsite power. The behaviour of the reactor is quite diverse depending on the assumptions made regarding operator actions. On the other hand, although the hypotheses include strong conservatisms, such as the single-failure criterion, all the results are quite far from the regulatory limits.
In addition, some improvements to the Emergency Operating Procedures to minimize the offsite release from the damaged SG in the event of an SGTR are outlined, taking into account the offsite dose sensitivity results.
Abstract:
After construction of the LYSS (Light cYcling Stressing Source) in early 2014, several CPV receivers, with and without a secondary optical element (SOE), have been aged under fast transient illumination cycling. The test plan for Madrid consisted of 50,000 cycles. Receivers with poor heat spreaders showed low reliability, but those with thicker metal layers passed the test well. The operation of the LYSS over 8 months, after more than 250,000 cycles, did not show any significant failure, except for lamp replacement every 120 hours on average. The equipment seems valid for unveiling weak receiver designs with respect to intensive illumination, in both steady and transient modes.
Abstract:
The evolution of the television market is led by 3DTV technology, and this tendency may accelerate during the next few years according to expert forecasts. However, 3DTV delivery by broadcast networks is not currently developed enough and acts as a bottleneck for the complete deployment of the technology. Thus, increasing interest is devoted to stereo 3DTV formats compatible with current HDTV video equipment and infrastructure, as they may greatly encourage 3D acceptance. In this paper, different subsampling schemes for HDTV-compatible transmission of both progressive and interlaced stereo 3DTV are studied and compared. The frequency characteristics and preserved frequency content of each scheme are analyzed, and a simple interpolation filter is specially designed. Finally, the advantages and disadvantages of the different schemes and filters are evaluated through quality testing on several progressive and interlaced video sequences.
Abstract:
Although 3DTV has led the evolution of the television market, its delivery by broadcast networks is still limited. Currently, 3DTV transmissions are usually done by combining both views into one common frame (side by side) so that standard HDTV transmission equipment can be used. Today, orthogonal subsampling is mostly used, but other alternatives will appear soon. Here, different subsampling schemes for both progressive and interlaced 3DTV are considered. For each possible scheme, its preserved frequency content is analyzed and a simple interpolation filter is designed. The analysis is carried out for progressive and interlaced video, and the designed filters are applied to different sequences, showing the advantages and disadvantages of the different options.
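The side-by-side, frame-compatible packing discussed in both abstracts can be sketched as follows. The choice of column-subsampling phases and the plain two-tap averaging filter are illustrative assumptions, not the specially designed filters of the papers:

```python
import numpy as np

# Hedged sketch of frame-compatible side-by-side packing: each view loses
# alternate columns (offset phases for left/right), both halves share one
# HD frame, and the receiver interpolates the missing columns.

def pack_side_by_side(left, right):
    """Drop alternate columns of each view and pack both into one frame."""
    return np.hstack([left[:, ::2], right[:, 1::2]])

def unpack(frame):
    """Recover both full-width views by averaging the nearest kept columns."""
    h, w = frame.shape
    half = w // 2
    views = []
    for sub, phase in [(frame[:, :half], 0), (frame[:, half:], 1)]:
        full = np.zeros((h, w))
        full[:, phase::2] = sub
        missing = np.arange(1 - phase, w, 2)
        lo = np.clip(missing - 1, phase, w - 1)      # nearest kept column, left
        hi = np.clip(missing + 1, 0, w - 2 + phase)  # nearest kept column, right
        full[:, missing] = 0.5 * (full[:, lo] + full[:, hi])
        views.append(full)
    return views

# smooth horizontal ramps reconstruct exactly away from the frame edges
left = np.tile(np.arange(16.0), (4, 1))
right = left + 100.0
l_rec, r_rec = unpack(pack_side_by_side(left, right))
```

Note the opposite column phases for the two views: that is one of the design choices the papers compare against purely orthogonal subsampling, since it changes which horizontal frequencies each view preserves.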