981 results for Root cause analysis


Relevance: 100.00%

Abstract:

Introduction: The aim of this study was to assess the occurrence of apical root transportation after the use of ProTaper Universal rotary files F3 and F4. Methods: Instruments were worked to the apex of the original canal, always by the same operator. Digital subtraction radiography images were produced in buccolingual and mesiodistal projections. A total of 25 radiographs were taken from root canals of human maxillary first molars with curvatures varying from 23-31 degrees. Quantitative data were analyzed by the intraclass correlation coefficient and the Wilcoxon nonparametric test (P = .05). Results: Buccolingual images revealed a significantly higher degree of apical transportation associated with F4 instruments than with F3 instruments relative to the original canal (Wilcoxon test, P = .007). No significant difference was observed in mesiodistal images (P = .492). Conclusions: F3 instruments should be used with care in curved canals, and F4 instruments should be avoided in apical third preparation of curved canals. (J Endod 2010;36:1052-1055)
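The Wilcoxon signed-rank statistic used in the study can be illustrated with a minimal pure-Python sketch; the paired measurements below are hypothetical examples, not the study's data.

```python
def wilcoxon_signed_rank(x, y):
    """W statistic for paired samples: zero differences are dropped and
    ties in |d| receive average ranks, as in the usual procedure."""
    diffs = [a - b for a, b in zip(x, y) if a != b]
    order = sorted(range(len(diffs)), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * len(diffs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and abs(diffs[order[j + 1]]) == abs(diffs[order[i]]):
            j += 1
        avg = (i + j) / 2 + 1          # average rank for a tie block
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    w_plus = sum(r for d, r in zip(diffs, ranks) if d > 0)
    w_minus = sum(r for d, r in zip(diffs, ranks) if d < 0)
    return min(w_plus, w_minus)

# Hypothetical apical transportation measurements (mm) for paired canals
f3 = [0.12, 0.10, 0.15, 0.09, 0.11]
f4 = [0.21, 0.18, 0.16, 0.19, 0.14]
# W+ is 0 here: every F4 value exceeds its F3 pair
print(wilcoxon_signed_rank(f3, f4))
```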

Relevance: 100.00%

Abstract:

The aim of this study was to analyze, under scanning electron microscopy (SEM), the morphologic characteristics of root surfaces after application of Carisolv™ gel in association with scaling and root planing (SRP). Sixty periodontally compromised extracted human teeth were randomly assigned to 6 groups: 1) SRP alone; 2) passive topical application of Carisolv™ + SRP; 3) active topical application of Carisolv™ + SRP; 4) multiple applications of Carisolv™ + SRP; 5) SRP + 24% EDTA; 6) topical application of Carisolv™ + SRP + 24% EDTA. Carisolv™ gel was applied to root surfaces for 30 s, followed by scaling and root planing consisting of 50 strokes with Gracey curettes in an apical-coronal direction, parallel to the long axis of the tooth. The only exception was group 4, in which the roots were instrumented until a smooth, hard and glass-like surface was achieved. All specimens were further analyzed by SEM. The results showed that treatment with Carisolv™ caused significant changes in the root surface morphology of periodontally compromised teeth only when the chemical agent was actively applied (burnishing technique). Carisolv™ failed to remove the smear layer completely, especially with a single application, regardless of the method of application. Multiple applications of Carisolv™ were necessary to achieve a smear layer reduction comparable to that obtained with 24% EDTA conditioning.

Relevance: 100.00%

Abstract:

The aim of this study was to evaluate the effects of different power parameters of an erbium, chromium: yttrium, scandium, gallium, garnet (Er,Cr:YSGG) laser on the morphology, attachment of blood components (ABC), roughness, and wear of irradiated root surfaces. Sixty-five bovine incisor teeth were used in this study, 35 of which were used for the analysis of root surface morphology and ABC. The remaining 30 teeth were used for roughness and root wear analysis. The samples were randomly allocated into seven groups: G1: Er,Cr:YSGG laser, 0.5 W; G2: Er,Cr:YSGG laser, 1.0 W; G3: Er,Cr:YSGG laser, 1.5 W; G4: Er,Cr:YSGG laser, 2.0 W; G5: Er,Cr:YSGG laser, 2.5 W; G6: Er,Cr:YSGG laser, 3.0 W; G7: scaling and root planing (SRP) with manual curettes. The root surfaces irradiated by the Er,Cr:YSGG laser at 1.0 W and those scaled with manual curettes presented the highest degrees of ABC. The samples irradiated by the Er,Cr:YSGG laser were rougher than the samples treated with the manual curette, and increasing the laser power parameters caused more root wear and greater roughness on the root surface. The Er,Cr:YSGG laser is safe to use for periodontal treatment, but irradiation greater than 1.0 W is not appropriate for this purpose. Microsc. Res. Tech. 78:529–535, 2015. © 2015 Wiley Periodicals, Inc.

Relevance: 100.00%

Abstract:

This work presents a systematic method for generating and processing alarm graphs, with the final objective of finding the root cause of the massive alarm floods produced in dispatching centers. Although much work on this subject has already been done, the problem of alarm management in industry remains unsolved. In this paper, a simple statistical analysis of the historical database is conducted. The results obtained from the alarm acquisition systems are used to generate a directed graph from which the most significant alarms are extracted, after first analyzing every case in which a large number of alarms is produced.
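As a toy illustration of the idea (the alarm names and the ranking rule below are invented, not taken from the paper), a directed graph can be built from the chronological alarm log, and the alarm that most often precedes others flagged as the root-cause suspect:

```python
from collections import Counter, defaultdict

# Hypothetical chronological alarm log from the historical database
log = ["A", "B", "A", "C", "A", "B", "D"]

# Directed edge X -> Y whenever alarm Y immediately follows alarm X
edges = Counter(zip(log, log[1:]))

# Total weight of outgoing edges per alarm
out_weight = defaultdict(int)
for (src, _dst), n in edges.items():
    out_weight[src] += n

# The alarm that most often triggers others is the root-cause suspect
root = max(out_weight, key=out_weight.get)
print(root)
```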

Relevance: 100.00%

Abstract:

With the increasing complexity of today's software, the software development process is becoming highly time- and resource-consuming. The increasing number of software configurations, input parameters, usage scenarios, supported platforms, external dependencies, and versions plays an important role in expanding the costs of maintaining and repairing unforeseeable software faults. To repair software faults, developers spend considerable time identifying the scenarios leading to those faults and root-causing the problems. While software debugging remains largely manual, this is not the case with software testing and verification. The goal of this research is to improve the software development process in general, and the software debugging process in particular, by devising techniques and methods for automated software debugging that leverage the advances in automatic test case generation and replay. In this research, novel algorithms are devised to discover faulty execution paths in programs by utilizing already existing software test cases, which can be either automatically or manually generated. The execution traces, or alternatively the sequence covers, of the failing test cases are extracted. Commonalities between these test case sequence covers are then extracted, processed, analyzed, and presented to the developers in the form of subsequences that may be causing the fault. The hypothesis is that code sequences shared between a number of test cases failing for the same reason resemble the faulty execution path; hence, the search space for the faulty execution path can be narrowed down by using a large number of test cases. To achieve this goal, an efficient algorithm is implemented for finding common subsequences among a set of code sequence covers. Optimization techniques are devised to generate shorter and more logical sequence covers, and to select, from the set of all possible common subsequences, those with a high likelihood of containing the root cause. A hybrid static/dynamic analysis approach is designed to trace the common subsequences back from the failure to the root cause. A debugging tool was created to let developers use the approach, integrated with an existing Integrated Development Environment and its program editors so that developers can benefit from both the tool's suggestions and their source code counterparts. Finally, a comparison between the developed approach and state-of-the-art techniques shows that developers need to inspect only a small number of lines in order to find the root cause of a fault. Furthermore, experimental evaluation shows that the algorithm optimizations lead to better results in terms of both running time and output subsequence length.
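A minimal sketch of the core idea (the traces and helper below are invented for illustration, not the thesis's algorithm): extract the contiguous subsequences shared by all failing test-case covers and rank the longest ones as root-cause candidates.

```python
def common_subsequences(traces, min_len=2):
    """Contiguous subsequences (as tuples) present in every trace."""
    def windows(trace):
        return {tuple(trace[i:j])
                for i in range(len(trace))
                for j in range(i + min_len, len(trace) + 1)}
    shared = windows(traces[0])
    for trace in traces[1:]:
        shared &= windows(trace)
    # Longest shared subsequences are the strongest suspects
    return sorted(shared, key=len, reverse=True)

failing = [  # hypothetical execution traces of failing test cases
    ["init", "parse", "lookup", "free", "use"],
    ["init", "lookup", "free", "use", "report"],
    ["parse", "lookup", "free", "use"],
]
print(common_subsequences(failing)[0])
```

With many failing traces, only the code sequence they all traverse for the same reason survives the intersection, which is exactly how the hypothesis narrows the search space.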

Relevance: 100.00%

Abstract:

The impact of end-customer quality complaints directly related to automotive components has shown a negative trend at the European level for the entire automotive industry. This research therefore concentrates its efforts on the most important items of the Pareto chart, seeking to understand the failure type and mechanism involved, and the link and impact of the design and process parameters, concluding with the development of one of the most desired tools of the company that hosted this project, a European methodology for classifying terminal defects, and with a list of real improvement opportunities based on the measurement and analysis of actual data. Through the development of this terminal-defect classification methodology, which is considered a valuable asset, all the other companies of the YAZAKI group will be able to characterize terminals as brittle or ductile and thus set in motion, more efficiently, the different existing internal procedures for safeguarding the components, improving manufacturing efficiency. From a brief observation, nothing can be said in absolute terms about the failure causes: the base materials, the design, handling during manufacture and storage, and the cold work performed by plastic deformation all play an important role. It was expected, however, that the failure was due to a combination of factors rather than a single cause. To acquire greater knowledge of this problem, unexplored by the company up to the start of this study, a thorough review of the existing literature on the subject was conducted, real production sites were visited and, of course, the actual parts were tested in a laboratory environment. To answer many of the major issues raised throughout the investigation, theoretical concepts covered in the literature review were used extensively, with a view to understanding the relationships between the different parameters concerned. It should be noted that technical studies on copper and its alloys are genuinely hard to find, and not all the desirable information is available. This investigation was performed as a YAZAKI Europe Limited company project and as a Master's thesis for Instituto Superior de Engenharia do Porto, conducted over 9 months in 2012/2013.

Relevance: 90.00%

Abstract:

This work belongs to the field of computational high-energy physics (HEP). The key methods used in this thesis to meet the challenges raised by the Large Hadron Collider (LHC) era experiments are object-orientation with software engineering, Monte Carlo simulation, cluster computing technology, and artificial neural networks. The first aspect discussed is the development of hadronic cascade models, used for the accurate simulation of medium-energy hadron-nucleus reactions up to 10 GeV. These models are typically needed in hadronic calorimeter studies and in the estimation of radiation backgrounds. Various applications outside HEP include the medical field (such as hadron treatment simulations), space science (satellite shielding), and nuclear physics (spallation studies). Validation results are presented for several significant improvements released in the Geant4 simulation tool, and the significance of the new models for computing in the Large Hadron Collider era is estimated. In particular, we estimate the ability of the Bertini cascade to simulate the Compact Muon Solenoid (CMS) hadron calorimeter (HCAL). LHC test beam activity has a tightly coupled simulation-to-data-analysis cycle; typically, a Geant4 computer experiment is used to understand test beam measurements. Thus another aspect of this thesis is a description of studies related to developing new CMS H2 test beam data analysis tools and performing data analysis on the basis of CMS Monte Carlo events. These events have been simulated in detail using Geant4 physics models, a full CMS detector description, and event reconstruction. Using the ROOT data analysis framework we have developed an offline ANN-based approach to tag b-jets associated with heavy neutral Higgs particles, and we show that this kind of NN methodology can be successfully used to separate the Higgs signal from the background in the CMS experiment.

Relevance: 90.00%

Abstract:

A number of methods are available for those researchers considering the addition of molecular analyses of ectomycorrhizal (EcM) fungi to their research projects and weighing the various approaches they might take. Analyzing natural EcM fungal communities has traditionally been a highly skilled, time-consuming process relying heavily on exacting morphological characterization of EcM root tips. Increasingly powerful molecular methods for analyzing EcM communities make this area of research available to a much wider range of researchers. Ecologists can gain from the body of work characterizing EcM while avoiding the requirement for exceptional expertise by carefully combining elements of traditional methods with the more recent molecular approaches. A cursory morphological analysis can yield a traditional quantification of EcM fungi based on tip numbers, a unit with functional and historical significance. Ectomycorrhizal root DNA extracts may then be analyzed with molecular methods widely used for characterizing microbiota. These range from methods applicable only to the simple mixes resulting from careful morphotyping, to community-oriented methods that identify many types in mixed samples as well as provide an estimate of their relative abundances. Extramatrical hyphae in bulk soil can also be more effectively studied, extending characterization of EcM fungal communities beyond the rhizoplane. The trend toward techniques permitting larger sample sets without prohibitive labor and time requirements will also permit us to more frequently address the issues of spatial and temporal variability and better characterize the roles of EcM fungi at multiple scales.

Relevance: 90.00%

Abstract:

This dissertation sets out to provide an immanent critique and deconstruction of ecological modernisation, or ecomodernism. It does so from a critical social theory approach, in order to correctly address the essential issues at the heart of the environmental crisis that ecomodernism purports to address. This critical approach argues that the solution to the environmental crisis can only be concretely achieved by recognising its root cause as being foremost the issue of material interaction between classes in society, and not simply between society and nature in any structurally meaningful way. Based on a metaphysic of false dualism, ecological modernisation attributes a materiality of exchange-value relations to issues of society, while simultaneously offering a non-material ontology to issues of nature. Thus ecomodernism serves asymmetrical relations of power whereby, as a polysemic policy discourse, it serves the material interests of those who have the power to impose abstract interpretations on the materiality of actual phenomena. The research of this dissertation is conducted through the critical evaluation of empirical data from two exemplary Irish case studies. Discovering the causal processes behind the various public issues in the case studies, and thereafter revealing the meaning structures underpinning those causal processes, is a theoretically driven task requiring analysis of the social practices found in the cognitive, cultural and structural constitutions of actors, mediations and systems respectively. The immanent critique of the case study paradigms therefore serves as a research strategy for comprehending Ireland's nature-society relations as influenced essentially by a systems (techno-corporatist) ecomodernist discourse. Moreover, the deconstruction of this systems ideological discourse serves not only to demonstrate how weak ecomodernism practically undermines its declared ecological objectives, but also to indicate how such objectives intervene as systemic contradictions at the cultural heart of Ireland's late modernisation.

Relevance: 90.00%

Abstract:

This paper presents a statistical-based fault diagnosis scheme for application to internal combustion engines. The scheme relies on an identified model that describes the relationships between a set of recorded engine variables using principal component analysis (PCA). Since combustion cycles are complex in nature and produce nonlinear relationships between the recorded engine variables, the paper proposes the use of nonlinear PCA (NLPCA). The paper further justifies the use of NLPCA by comparing the model accuracy of the NLPCA model with that of a linear PCA model. A new nonlinear variable reconstruction algorithm and bivariate scatter plots are proposed for fault isolation, following the application of NLPCA. The proposed technique allows the diagnosis of different fault types under steady-state operating conditions. More precisely, nonlinear variable reconstruction can remove the fault signature from the recorded engine data, which allows the identification and isolation of the root cause of abnormal engine behaviour. The paper shows that this can lead to (i) an enhanced identification of potential root causes of abnormal events and (ii) the masking of faulty sensor readings. The effectiveness of the enhanced NLPCA based monitoring scheme is illustrated by its application to a sensor fault and a process fault. The sensor fault relates to a drift in the fuel flow reading, whilst the process fault relates to a partial blockage of the intercooler. These faults are introduced to a Volkswagen TDI 1.9 Litre diesel engine mounted on an experimental engine test bench facility.
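The variable-reconstruction idea can be sketched with a linear, one-component PCA model (the paper uses a nonlinear extension; the loading vector and samples below are invented for illustration):

```python
import math

# Hypothetical one-component PCA model with unit loading vector p and
# zero mean; the paper's NLPCA model is a nonlinear extension of this.
norm = math.sqrt(1.0 + 4.0 + 1.0 + 0.25)
p = [v / norm for v in (1.0, 2.0, -1.0, 0.5)]

def project(x):
    t = sum(pi * xi for pi, xi in zip(p, x))    # component score
    return [pi * t for pi in p]                 # model estimate of x

def spe(x):
    """Squared prediction error of a sample against the PCA model."""
    return sum((xi - xh) ** 2 for xi, xh in zip(x, project(x)))

normal = [1.0, 2.0, -1.0, 0.5]   # sample lying exactly on the model
faulty = normal[:]
faulty[0] += 5.0                 # drift on sensor 0 (e.g. fuel flow)

def reconstruct(x, i):
    """Iteratively replace variable i with its model-based estimate."""
    x = x[:]
    for _ in range(100):
        x[i] = project(x)[i]
    return x

# Reconstructing the truly faulty variable removes the fault signature,
# so its reconstructed SPE drops back to (near) zero.
errors = [spe(reconstruct(faulty, i)) for i in range(4)]
fault_variable = min(range(4), key=lambda i: errors[i])
print(fault_variable)
```

This is the sense in which reconstruction both isolates the root cause and masks the faulty sensor reading: the reconstructed value can stand in for the drifted measurement.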

Relevance: 90.00%

Abstract:

Rough Set Data Analysis (RSDA) is a non-invasive data analysis approach that relies solely on the data to find patterns and decision rules. Despite its non-invasive approach and its ability to generate human-readable rules, classical RSDA has not been successfully used in commercial data mining and rule-generating engines. The reason is its scalability: classical RSDA slows down a great deal on larger data sets and takes much longer to generate the rules. This research aims to address the issue of scalability in rough sets by improving the performance of the attribute reduction step of classical RSDA, which is the root cause of its slow performance. We propose to move the entire attribute reduction process into the database. We defined a new schema to store the initial data set, and then defined SQL queries on this schema to find the attribute reducts correctly and faster than the traditional RSDA approach. We tested our technique on two typical data sets and compared our results with the traditional RSDA approach for attribute reduction. Finally, we highlight some issues with our proposed approach that could lead to future research.
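A minimal sketch of pushing the reduct consistency check into the database (the toy decision table, column names, and SQL below are illustrative assumptions, not the thesis's schema or queries):

```python
import itertools
import sqlite3

# Toy decision table: condition attributes a, b, c and decision d
rows = [
    (0, 0, 0, "no"),
    (0, 1, 0, "yes"),
    (1, 0, 1, "no"),
    (1, 1, 1, "yes"),
]

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE t (a INTEGER, b INTEGER, c INTEGER, d TEXT)")
con.executemany("INSERT INTO t VALUES (?, ?, ?, ?)", rows)

def is_consistent(attrs):
    """True if grouping by `attrs` never mixes decision values, i.e.
    the subset preserves the discernibility of the full table."""
    cols = ", ".join(attrs)
    (worst,) = con.execute(
        f"SELECT MAX(n) FROM (SELECT COUNT(DISTINCT d) AS n "
        f"FROM t GROUP BY {cols})"
    ).fetchone()
    return worst == 1

# Smallest consistent attribute subsets are the reducts
reducts = []
for size in range(1, 4):
    for subset in itertools.combinations(("a", "b", "c"), size):
        if is_consistent(subset):
            reducts.append(subset)
    if reducts:
        break
print(reducts)
```

The point of the in-database variant is that the grouping and counting run inside the engine, so the data never has to be materialised in application memory.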

Relevance: 90.00%

Abstract:

Genetic generalized epilepsies (GGEs) are a group of heterogeneous epileptic syndromes that usually manifest during childhood and adolescence. GGEs account for 30% of all epilepsies. There is currently no cure for genetic generalized epilepsy. Within this group of epilepsies, subjects most often have no brain lesions, which means that genetic factors play an important role in the etiology of the disease. In recent years, several genes involved in familial forms of GGE have been identified. Most of them encode ion channels, including the ligand-gated GABAA receptor (GABAAR). From this group, mutations have been identified in four subunits of the GABAA receptor. The first general objective of this thesis is to assess the genetic component of our GGE cohort that is explained by the genes encoding the GABAA receptor subunits. Second, the role of the identified variants is defined and analyzed in order to better understand their impact on the pathogenesis of this phenotype. The first part of the project consists of an exhaustive analysis of mutations in the coding regions of the 19 GABRA genes in patients with GGE. By screening Quebec families with GGE, we identified 22 rare variants, including 19 missense and 3 nonsense variants, in 14 GABAAR subunits. By sequencing these genes in a large cohort of cases and controls, we established the rare-variant profile for these genes. These data suggest that a significant proportion (8%) of patients with GGE carry rare variants in GABAAR genes. The second part focuses directly on some of the genes identified in the first part. From this group, five new mutations were discovered in genes already associated with epilepsy (GABRA1 and GABRG2). We assessed the impact of these mutations on the genetic mechanisms of epilepsy by measuring the effects of the variants on the structure and function of the GABAA receptor. The third part focuses on our hypothesis that mutant GABAARs alter the effect of GABA during development of the central nervous system (CNS). The main objective is to determine the relative contribution of each mutated subunit to CNS development. We demonstrated that such a loss of function has a significant impact on the development of GABAergic and glutamatergic synapses as well as on the plasticity of cortical circuits. Our results allowed us to clarify how mutations in GABRA genes can lead to GGE. Eventually, the molecular characterization of these mutations will contribute to the development of new diagnostic tools and will facilitate the design of better-targeted treatments for people affected by this chronic neurological condition.

Relevance: 90.00%

Abstract:

This thesis proposes a framework for identifying the root cause of a voltage disturbance, as well as its source location (upstream or downstream) relative to the monitoring point. The framework works with three-phase voltage and current waveforms collected in radial distribution networks without distributed generation; real-world and synthetic waveforms are used to test it. The framework involves features conceived on the basis of electrical principles, under certain hypotheses about the analyzed phenomena. The features considered are based on the waveforms and on timestamp information. Multivariate analysis of variance and rule induction algorithms are applied to assess the amount of meaningful information explained by each feature, according to the root cause of the disturbance and its source location. The classification rates obtained show that the proposed framework could be used for automatic diagnosis of voltage disturbances collected in radial distribution networks. Furthermore, the diagnostic results can subsequently be used to support power network operation, maintenance and planning.
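A toy, rule-based sketch of the diagnosis step (the thresholds, class names, and waveforms below are invented for illustration; the thesis derives its rules from MANOVA and rule induction on measured features):

```python
import math

def rms(samples):
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def diagnose(v_pre, v_dur, i_pre, i_dur):
    """Classify a disturbance from pre-event and during-event waveforms."""
    dv = rms(v_dur) / rms(v_pre)      # residual voltage, per unit
    di = rms(i_dur) / rms(i_pre)      # current magnification
    # Deep dip with a large current rise suggests a short-circuit fault
    root_cause = "fault" if dv < 0.9 and di > 2.0 else "load switching"
    # A current increase at the monitor points to a downstream source
    source = "downstream" if di > 1.0 else "upstream"
    return root_cause, source

# One cycle of synthetic waveforms, 100 samples per cycle
wave = [math.sin(2 * math.pi * k / 100) for k in range(100)]
v_pre = [1.0 * s for s in wave]
v_dur = [0.5 * s for s in wave]       # deep voltage dip
i_pre = [1.0 * s for s in wave]
i_dur = [4.0 * s for s in wave]       # large current rise
print(diagnose(v_pre, v_dur, i_pre, i_dur))
```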

Relevance: 90.00%

Abstract:

Internal combustion engines release about one third of the energy bound in the fuel as exhaust waste-gas energy, and another third is wasted through heat transfer to the ambient. Friction losses, in turn, are the third-largest root cause of energy loss in internal combustion engines. During city driving, frictional losses can be of the same magnitude as the effective work, and during cold start these losses are even bigger. It is therefore an obvious step to utilise wasted exhaust energy to warm up the engine oil directly, since the frictional losses of any engine can be reduced at part load. Sensitivity analyses have been conducted for different concepts that utilise exhaust energy to reduce oil viscosity and friction. For a new system with an exhaust gas/oil heat exchanger the following benefits have been demonstrated:

• Fuel consumption reductions of over 7%, measured as an average over 5 NEDC tests, compared to the standard system configuration.
• Significant reductions in exhaust emissions, mainly CO and NOx.
• Significantly higher oil temperatures during cold start, indicating a large potential to reduce engine wear through reduced water condensation in the crankcase.
• Further fuel consumption reductions of 3.3% to 4.6%, beyond the 7% measured over the NEDC test, can be expected under real-world customer usage conditions at lower ambient temperatures.

Oil temperature measurements and analysis resulted in the idea of a novel system with further potential to reduce fuel consumption. This Oil Viscosity Energy Recovery System (OVER 7™) consists of three key features that add significant synergies when combined: an oil warm-up circuit/bypass, oil pressure control, and an exhaust gas/oil heat exchanger. The system separates the thermal inertias of the oil in the engine galleries and the oil pan, reduces hydraulic pumping losses, increases the heat transfer from the cylinder head to the oil, and utilises the exhaust heat to reduce oil friction.

The project demonstrated that sensitivity analysis is an important tool for the evaluation of different concepts. Especially for new concepts that involve transient heat transfer, such a qualitative approach, combined with accurate experiments and measurements, can lead to the desired improvements faster and more efficiently than time-consuming detailed simulations.

Relevance: 90.00%

Abstract:

The competitiveness of trade, driven by the greater availability of products at lower quality and cost, has created a new reality of industrial production with small margins. Deviations in production cannot be ruled out; uncertainties can occur statistically. Consumers worldwide, including in Brazil, are protected by consumer-protection codes in lawsuits over poor product quality. An automobile is composed of various systems and thousands of constituent parts, increasing the likelihood of failure. The dynamic and safety systems are critical with respect to the consequences of possible failures. Investigating a failure gives us the possibility of learning and contributing to various improvements. Our main purpose in this work is to develop a systematic, specific methodology for investigating the root cause of the flaw that occurred in an axle end of the front suspension of an automobile, and to perform comparative analyses between the fractured part and the design information. Our research was based on a flaw generated in an automotive suspension system involved in a judicial case, which resulted in property and personal damages. In investigations involving the analysis of mechanical failures, knowledge of materials engineering plays a crucial role, since it enables the application of materials-characterization techniques, relating the technical attributes required of a part to the structure of its manufacturing material, thus providing a greater scientific contribution to the work. The specific methodology developed follows its own flowchart. In the early phase, the data in the records and information on those involved were collected. The following laboratory analyses were then performed: macrography of the fracture; micrography of the initial and final fracture with SEM (scanning electron microscopy); phase analysis with optical microscopy; Brinell hardness and Vickers microhardness analyses; quantitative and qualitative chemical analysis using X-ray fluorescence, with optical spectroscopy for carbon analysis; and a qualitative study of the state of stress. Field data were also collected. In the data analyses, the values measured on the fractured and stock parts were compared with the design values. After the investigation, it was concluded that: the developed methodology systematized the investigation and enabled data cross-checking, minimizing the probability of diagnostic error; the morphology of the fracture indicates failure by the fatigue mechanism at a geometrically propitious location, a stress concentrator; the part was subjected to low stresses, as indicated by the sectional area of the final fracture; the manufacturing material of the fractured part has low ductility; the component fractured earlier than recommended by the manufacturer; the percentages of C, Si, Mn and Cr in the fractured part differ from the design values; the upper-limit hardness value of the fractured part is higher than the design value; and there is no manufacturing uniformity between the stock and fractured parts. This work will contribute to optimizing the guidance of actions in mechanical engineering judicial expert analysis.