999 results for Technical Diagnostics


Relevance:

70.00%

Publisher:

Abstract:

The operation of technical processes requires increasingly advanced supervision and fault diagnostics to improve reliability and safety. This paper gives an introduction to the field of fault detection and diagnostics and provides a short classification of methods. The growing complexity and functional importance of inertial navigation systems (INS) mean that equipment failures lead to high losses. The paper is devoted to the development of an INS diagnostics system that identifies the cause of a malfunction. The practical realization of this system is a software package performing a set of multidimensional information analyses. The project consists of three parts: a subsystem for analysis, a subsystem for data collection, and a universal interface for an open-architecture realization. To improve diagnostics on small analysis samples, new approaches based on the voting of pattern recognition algorithms, taking into account correlations between target and input parameters, will be applied. The system is currently at the development stage.
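The "voting of pattern recognition algorithms" mentioned above can be illustrated with a short sketch: several heterogeneous classifiers each cast a vote on the fault class, and the majority wins. This is a minimal sketch of the general idea only; the scikit-learn estimators and the synthetic stand-in data are assumptions of the example, not the authors' system.

```python
# Minimal sketch of majority voting across pattern-recognition algorithms
# (illustrative only; the paper's actual diagnostic features are not public).
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

# Stand-in for multidimensional INS telemetry labelled with fault causes
X, y = make_classification(n_samples=200, n_features=12, n_classes=3,
                           n_informative=6, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

vote = VotingClassifier(
    estimators=[("lr", LogisticRegression(max_iter=1000)),
                ("rf", RandomForestClassifier(random_state=0)),
                ("knn", KNeighborsClassifier())],
    voting="hard",  # each algorithm casts one vote per sample
)
vote.fit(X_tr, y_tr)
print("held-out accuracy: %.2f" % vote.score(X_te, y_te))
```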

Relevance:

30.00%

Publisher:

Abstract:

Diagnostics is based on the characterization of the mechanical system condition and allows early detection of a possible fault. Signal processing is an approach widely used in diagnostics, since it directly characterizes the state of the system. Several types of advanced signal processing techniques have been proposed in recent decades in addition to more conventional ones. Seldom are these techniques able to handle non-stationary operation; diagnostics of roller bearings is no exception. In this paper, a new vibration signal processing tool, able to perform roller bearing diagnostics under any working condition and noise level, is developed on the basis of two data-adaptive techniques, Empirical Mode Decomposition (EMD) and Minimum Entropy Deconvolution (MED), coupled by means of the mathematics related to the Hilbert transform. The effectiveness of the new signal processing tool is proven by means of experimental data measured on a test rig that employs high-power, industrial-size components.
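As an illustration of the final stage of such a pipeline, the sketch below extracts a data-adaptive mode with EMD and computes the Hilbert envelope spectrum, where bearing fault frequencies would appear; the MED pre-whitening step is omitted for brevity. The PyEMD package, the synthetic impact signal and all parameter values are assumptions of this sketch, not the paper's implementation.

```python
# Sketch: EMD + Hilbert envelope spectrum of a synthetic impact signal
# (MED pre-whitening omitted; parameters are illustrative assumptions).
import numpy as np
from scipy.signal import hilbert
from PyEMD import EMD  # assumption: the PyEMD (EMD-signal) package is installed

fs = 20_000
t = np.arange(0, 1.0, 1 / fs)
# Roughly periodic impacts at ~97 Hz convolved with a decaying response, plus noise
impacts = (np.sin(2 * np.pi * 97 * t) > 0.999).astype(float)
signal = np.convolve(impacts, np.exp(-800 * t[:200]), mode="same")
signal += 0.2 * np.random.randn(len(t))

imfs = EMD()(signal)              # data-adaptive decomposition into IMFs
mode = imfs[0]                    # the highest-frequency IMF usually carries the impacts
envelope = np.abs(hilbert(mode))  # Hilbert-transform envelope
spectrum = np.abs(np.fft.rfft(envelope - envelope.mean()))
freqs = np.fft.rfftfreq(len(envelope), 1 / fs)
print("dominant envelope frequency: %.1f Hz" % freqs[spectrum.argmax()])  # ~97 Hz expected
```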

Relevance:

30.00%

Publisher:

Abstract:

Monitoring the integrity of rolling element bearings in the traction system of high-speed trains is a fundamental operation for avoiding catastrophic failures and implementing effective condition-based maintenance strategies. Diagnostics of rolling element bearings is usually based on vibration signal analysis by means of suitable signal processing techniques. The experimental validation of such techniques has traditionally been performed through laboratory tests on artificially damaged bearings, while their actual effectiveness in industrial applications, particularly in the field of rail transport, remains scarcely investigated. This paper addresses the diagnostics of bearings removed from service after long-term operation on a high-speed train. These worn bearings have been installed on a test rig, consisting of a complete full-scale traction system of a high-speed train, able to reproduce the effects of wheel-track interaction and bogie-wheelset dynamics. The results of the experimental campaign show that suitable signal processing techniques are able to diagnose bearing failures even in this harsh and noisy application. Moreover, the most suitable location of the sensors on the traction system is also proposed.

Relevance:

30.00%

Publisher:

Abstract:

Technical or contaminated ethanol products are sometimes ingested, either accidentally or on purpose. Typical misused products are black-market liquor and automotive products, e.g., windshield washer fluids. In addition to less toxic solvents, these liquids may contain deadly methanol. Symptoms of even a lethal solvent poisoning are often non-specific at the early stage. The present series of studies was carried out to develop a breath-based method for diagnosing solvent intoxication, to speed up a diagnostic procedure conventionally based on blood tests. Especially in the case of methanol ingestion, the analysis method should be sufficiently sensitive and accurate to determine the presence of even small amounts of methanol in a mixture of ethanol and other less-toxic components. In addition to the studies on the FT-IR method, the Dräger 7110 evidential breath analyzer was examined to determine its ability to reveal a coexisting toxic solvent. An industrial Fourier transform infrared analyzer was modified for breath testing: the sample cell fittings were widened and the cell size reduced in order to obtain an alveolar sample directly from a single exhalation. The performance and feasibility of the Gasmet FT-IR analyzer were tested in clinical settings and in the laboratory. Human breath screening studies were carried out with healthy volunteers, inebriated homeless men, emergency room patients and methanol-intoxicated patients. A number of the breath analysis results were compared to blood test results in order to approximate the blood-breath relationship. In the laboratory experiments, the analytical performance of the Gasmet FT-IR analyzer and the Dräger 7110 evidential breath analyzer was evaluated by means of artificial samples resembling exhaled breath. The investigations demonstrated that a successful breath ethanol analysis by the Dräger 7110 evidential breath analyzer could exclude any significant methanol intoxication. In contrast, the device did not detect very high levels of acetone, 1-propanol and 2-propanol in simulated breath, as it was not equipped to recognize the interfering component. According to the studies, the Gasmet FT-IR analyzer was adequately sensitive, selective and accurate for solvent intoxication diagnostics. In addition to diagnostics, fast breath solvent analysis proved feasible for monitoring ethanol and methanol concentrations during haemodialysis treatment. Because of the simplicity of the sampling and analysis procedure, non-laboratory personnel, such as police officers or social workers, could also operate the analyzer for screening purposes.
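Multicomponent quantification with an FT-IR analyzer of the kind described is commonly performed by fitting the measured absorbance spectrum as a linear combination of reference spectra (classical least squares). The sketch below shows only this general idea on synthetic Gaussian bands; the band positions, widths and concentrations are invented and do not correspond to the Gasmet analyzer's actual calibration.

```python
# Illustrative classical-least-squares fit of a mixed absorbance spectrum
# (synthetic reference bands; not the Gasmet analyzer's real calibration).
import numpy as np

wavenumber = np.linspace(900, 1200, 400)  # cm^-1, assumed spectral window

def band(center, width):
    """Gaussian stand-in for a reference absorbance spectrum at unit concentration."""
    return np.exp(-0.5 * ((wavenumber - center) / width) ** 2)

refs = np.column_stack([band(1045, 15),   # "ethanol"  (hypothetical band)
                        band(1030, 12),   # "methanol" (hypothetical band)
                        band(1090, 20)])  # "acetone"  (hypothetical band)

true_conc = np.array([500.0, 40.0, 0.0])  # ppm, invented for the example
measured = refs @ true_conc + 0.5 * np.random.randn(len(wavenumber))

est, *_ = np.linalg.lstsq(refs, measured, rcond=None)
print("estimated concentrations (ppm):", np.round(est, 1))
```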

Relevance:

30.00%

Publisher:

Abstract:

Combustion oscillations in gas turbines can result in serious damage. One method used to predict such oscillations is to analyze the combustor acoustics using a simple linear model. Such a model requires a flame transfer function to describe the response of the heat release to flow perturbations inside the combustor. This paper reports on the application of Planar Laser Induced Fluorescence (PLIF) of OH radicals to analyze the response of a lean premixed flame to oncoming flow perturbations. Both self-excited oscillations and low-amplitude forced oscillations at various frequencies are investigated in an atmospheric-pressure model combustor rig. In order to visualize fluctuations of the local fuel distribution, acetone-PLIF was also applied to non-reacting, acoustically forced flows at oscillation frequencies of 200 Hz and 510 Hz. OH-PLIF images were acquired over a range of operating parameters. The results presented in this paper originate from data sets acquired at fixed phase angles during the oscillation cycle. Comparative experiments on self-excited and forced acoustic oscillations show that the flame and the combustion intensity develop similarly throughout the pressure cycle in both cases. Although the peak fluorescence intensities differ between the self-excited and the forced instabilities, there is a clear correspondence in the observed frequency and phase information from the two cases. This result encourages a comparison of the OH-PLIF and acetone-PLIF results. Quantitative measurements of the equivalence ratio in specific areas of the measurement plane offer insight into the complex phenomena coupling acoustic perturbations, i.e. flow velocity fluctuations, to fluctuations in fuel distribution and combustion intensity, ultimately resulting in self-excited combustion oscillations.
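Phase-resolved analysis of the kind described, with images referenced to fixed phase angles of the pressure cycle, can be sketched as phase-binning frames against the instantaneous phase of a reference pressure signal. The Hilbert-transform phase extraction and all data below are illustrative assumptions, not the paper's acquisition scheme.

```python
# Sketch: phase-averaging image frames over an acoustic oscillation cycle,
# using the Hilbert phase of a reference pressure signal (illustrative only).
import numpy as np
from scipy.signal import hilbert

fs, f_osc, n_frames = 2000, 200.0, 2000   # assumed sampling/oscillation rates
t = np.arange(n_frames) / fs
pressure = np.cos(2 * np.pi * f_osc * t)  # stand-in reference pressure signal
frames = np.random.rand(n_frames, 32, 32) # stand-in PLIF images

phase = np.angle(hilbert(pressure))       # instantaneous phase in [-pi, pi)
n_bins = 8
bins = ((phase + np.pi) / (2 * np.pi) * n_bins).astype(int) % n_bins

phase_avg = np.stack([frames[bins == b].mean(axis=0) for b in range(n_bins)])
print(phase_avg.shape)                    # (8, 32, 32): one mean image per phase bin
```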

Relevance:

30.00%

Publisher:

Abstract:

While personalised cancer medicine holds great promise, targeting therapies to the biological characteristics of patients is limited by the number of validated biomarkers currently available. The implementation of biomarkers has faced many challenges, with few biomarkers reaching cancer patients in the clinic. Many biomarkers have been published and claimed to be therapeutically useful, but few become part of the clinical decision-making process, owing to technical, validation and market-access issues. To reduce this attrition rate, there is a significant need for policy makers and reimbursement agencies to define specific evidence requirements for the introduction of biomarkers into clinical practice. Once these requirements are more clearly defined, in a manner analogous to pharmaceuticals, researchers and diagnostic companies can better focus their biomarker research and development on meeting them, which should lead to the more rapid introduction of new molecular oncology tests for patient benefit.

Relevance:

30.00%

Publisher:

Abstract:

The problem of using information available from one variable X to make inference about another Y is classical in many physical and social sciences. In statistics this is often done via regression analysis, where the mean response is used to model the data. One stipulates the model Y = μ(X) + ε. Here μ(x) is the mean response at the predictor variable value X = x, and ε = Y − μ(X) is the error. In classical regression analysis both (X, Y) are observable, and one then proceeds to make inference about the mean response function μ(X). In practice there are numerous examples where X is not available, but a variable Z is observed which provides an estimate of X. As an example, consider the herbicide study of Rudemo et al. [3], in which a nominal measured amount Z of herbicide was applied to a plant but the actual amount X absorbed by the plant is unobservable. As another example, from Wang [5], an epidemiologist studies the severity of a lung disease, Y, among the residents of a city in relation to the amount of certain air pollutants. The amount of air pollutants Z can be measured at certain observation stations in the city, but the actual exposure X of the residents to the pollutants is unobservable and may vary randomly from the Z-values. In both cases X = Z + error; this is the so-called Berkson measurement error model. In the more classical measurement error model one observes an unbiased estimator W of X and stipulates the relation W = X + error. An example of this model occurs when assessing the effect of nutrition X on a disease: measuring nutrition intake precisely within 24 hours is almost impossible. There are many similar examples in agricultural or medical studies; see, e.g., Carroll, Ruppert and Stefanski [1] and Fuller [2], among others. In this talk we shall address the question of fitting a parametric model to the regression function μ(X) in the Berkson measurement error model: Y = μ(X) + ε, X = Z + η, where η and ε are random errors with E(ε) = 0, X and η are d-dimensional, and Z is the observable d-dimensional random vector.
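A small simulation makes the practical difference between the two error models concrete: with a linear mean function, regressing Y on the observed Z in the Berkson model still recovers the slope, whereas classical measurement error attenuates it. The linear μ and the Gaussian errors below are assumptions chosen for illustration.

```python
# Sketch: Berkson (X = Z + eta) vs. classical (W = X + error) measurement error
# in a linear regression, illustrating attenuation only in the classical case.
import numpy as np

rng = np.random.default_rng(0)
n, beta = 100_000, 2.0

Z = rng.normal(0.0, 1.0, n)            # observed dose / station measurement
eta = rng.normal(0.0, 0.5, n)
X = Z + eta                            # Berkson: truth varies around the observed value
Y = beta * X + rng.normal(0.0, 1.0, n)
slope_berkson = np.polyfit(Z, Y, 1)[0]

X2 = rng.normal(0.0, 1.0, n)           # classical: we observe a noisy proxy W
W = X2 + rng.normal(0.0, 0.5, n)
Y2 = beta * X2 + rng.normal(0.0, 1.0, n)
slope_classical = np.polyfit(W, Y2, 1)[0]

print("Berkson slope   ~ %.2f (unbiased)" % slope_berkson)      # ~2.0
print("classical slope ~ %.2f (attenuated)" % slope_classical)  # ~2.0/(1+0.25) = 1.6
```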

Relevance:

30.00%

Publisher:

Relevance:

30.00%

Publisher:

Abstract:

Fluorescence is a troublesome side effect in laboratory Raman studies on sulfuric acid solutions and aerosol particles. We performed experiments showing that organic matter induces fluorescence in H2SO4/H2O solutions. The intensity of the fluorescence signal appears to be almost independent of the concentration of the organic substances, but depends strongly on the sulfuric acid concentration. The ubiquity of organic substances in the atmosphere, their relatively high abundance, and the insensitivity of the fluorescence with respect to their concentrations will render most acidic natural aerosols subject to absorption and fluorescence, possibly influencing climate forcing. We show that, while fluorescence may in the future become a valuable tool of aerosol diagnostics, the concurrent absorption is too small to significantly affect the atmosphere's radiative balance.

Relevance:

30.00%

Publisher:

Abstract:

Recently, an ever-increasing degree of automation has been observed in most industrial processes. This increase is motivated by the demand for systems with high performance in terms of the quality of the products and services generated, productivity, efficiency, and low costs in design, realization and maintenance. This growth of complex automation systems is rapidly spreading over automated manufacturing systems (AMS), where the integration of mechanical and electronic technology, typical of mechatronics, is merging with other technologies such as informatics and communication networks. An AMS is a very complex system that can be thought of as a set of flexible working stations and one or more transportation systems. To understand how important these machines are in our society, consider that every day most of us use bottles of water or soda and buy boxed products such as food or cigarettes. Another indication of their complexity is that the consortium of machine producers has estimated around 350 types of manufacturing machine. A large number of manufacturing machine industries are present in Italy, notably the packaging machine industry; a great concentration of this kind of industry is located in the Bologna area, which for this reason is called the "packaging valley". Usually, the various parts of an AMS interact in a concurrent and asynchronous way, and coordinating the parts of the machine to obtain a desired overall behaviour is a hard task. Often this is the case in large-scale systems organized in a modular and distributed manner. Even if the success of a modern AMS from a functional and behavioural point of view is still attributable to the design choices made in defining the mechanical structure and the electrical/electronic architecture, the system that governs the control of the plant is becoming crucial because of the large number of duties assigned to it. Apart from the activities inherent to the automation of the machine cycles, the supervisory system is called on to perform other main functions, such as: emulating the behaviour of traditional mechanical members, thus allowing a drastic constructive simplification of the machine and a crucial functional flexibility; dynamically adapting the control strategies to the different productive needs and operational scenarios; obtaining a high quality of the final product through verification of the correctness of the processing; prompting the machine operator to promptly and carefully take the actions needed to establish or restore optimal operating conditions; and managing diagnostic information in real time, as a support for machine maintenance operations. The facilities that designers can directly find on the market, in terms of software component libraries, provide adequate support for the implementation of both top-level and bottom-level functionalities, typically pertaining to the domains of user-friendly HMIs, closed-loop regulation and motion control, and fieldbus-based interconnection of remote smart devices.
What is still lacking is a reference framework comprising a comprehensive set of highly reusable logic control components that, by focusing on the cross-cutting functionalities characterizing the automation domain, may help designers model and structure their applications according to their specific needs. Historically, the design and verification process for complex automated industrial systems has been performed in an empirical way, without a clear distinction between functional and technological/implementation concepts and without a systematic method for dealing organically with the complete system. In the field of analog and digital control, design and verification through formal and simulation tools have long been adopted, at least for multivariable and/or nonlinear controllers for complex time-driven dynamics, as in the fields of vehicles, aircraft, robots, electric drives and complex power electronics equipment. Moving to the field of logic control, typical of industrial manufacturing automation, the design and verification process is approached in a completely different, usually very "unstructured" way. No clear distinction between functions and implementations, or between functional architectures and technological architectures and platforms, is considered. Probably this difference is due to the different "dynamical framework" of logic control with respect to analog/digital control: in logic control, discrete-event dynamics replace time-driven dynamics, hence most of the formal and mathematical tools of analog/digital control cannot be directly migrated to logic control to highlight the distinction between functions and implementations. In addition, in the common view of application technicians, logic control design is strictly connected to the adopted implementation technology (relays in the past, software nowadays), leading again to a deep confusion between the functional view and the technological view. In industrial automation software engineering, concepts such as modularity, encapsulation, composability and reusability are strongly emphasized and profitably realized in the so-called object-oriented methodologies. Industrial automation has lately been adopting this approach, as testified by the IEC 61131-3 and IEC 61499 standards, which have been considered in commercial products only recently. On the other hand, in the scientific and technical literature many contributions have already been proposed to establish a suitable modelling framework for industrial automation. In recent years a considerable growth in the exploitation of innovative concepts and technologies from the ICT world in industrial automation systems has been observed. As far as logic control design is concerned, Model Based Design (MBD) is being imported into industrial automation from the software engineering field. Another key point in industrial automated systems is the growth of requirements in terms of availability, reliability and safety for technological systems. In other words, the control system should not only deal with the nominal behaviour, but should also handle other important duties, such as diagnosis and fault isolation, recovery and safety management. Indeed, together with high performance, in complex systems fault occurrences increase.
This is a consequence of the fact that, as typically occurs in mechatronic systems, in complex systems such as AMS an increasing number of electronic devices are present alongside reliable mechanical elements, and these devices are by their own nature more vulnerable. The diagnosis and fault isolation problem in a generic dynamical system consists in the design of an elaboration unit that, by appropriately processing the inputs and outputs of the dynamical system, is capable of detecting incipient faults on the plant devices and reconfiguring the control system so as to guarantee satisfactory performance. The designer should be able to formally verify the product, certifying that, in its final implementation, it will perform its required function with the desired level of reliability and safety; the next step is to prevent faults and eventually reconfigure the control system so that faults are tolerated. On this topic, an important contribution to the formal verification of logic control, fault diagnosis and fault tolerant control derives from Discrete Event Systems theory. The aim of this work is to define a design pattern and a control architecture to help the designer of control logic in industrial automated systems. The work starts with a brief discussion of the main characteristics and a description of industrial automated systems in Chapter 1. In Chapter 2 a survey of the state of the software engineering paradigm applied to industrial automation is discussed. Chapter 3 presents an architecture for industrial automated systems based on the new concept of Generalized Actuator, showing its benefits, while in Chapter 4 this architecture is refined using a novel entity, the Generalized Device, in order to obtain better reusability and modularity of the control logic. In Chapter 5 a new approach based on Discrete Event Systems is presented for the problem of software formal verification, together with an active fault tolerant control architecture using online diagnostics. Finally, concluding remarks and some ideas on new directions to explore are given. Appendix A briefly reports some concepts and results about Discrete Event Systems which should help the reader understand some crucial points in Chapter 5, while Appendix B gives an overview of the experimental testbed of the Laboratory of Automation of the University of Bologna, used to validate the approaches presented in Chapters 3, 4 and 5. Appendix C reports some component models used in Chapter 5 for formal verification.
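To make the Discrete Event Systems viewpoint concrete, the toy sketch below models a plant component as a finite automaton with an unobservable fault event and tracks the set of states consistent with the observed events, in the spirit of a diagnoser. All state and event names are invented for illustration; this is not the architecture developed in the thesis.

```python
# Toy discrete-event model: an actuator that may silently fail ("fault" is
# unobservable); a diagnoser-style state estimate flags certain faults.
# Invented states/events for illustration only.
TRANSITIONS = {
    ("idle", "start"): "moving",
    ("moving", "done"): "idle",
    ("moving", "fault"): "broken",   # unobservable event
    ("broken", "start"): "broken",   # a broken actuator never reports "done"
}

def estimate(observed_events):
    """Return the set of states consistent with the observed event sequence."""
    states = {"idle"}
    for ev in observed_events:
        nxt = set()
        for s in states:
            # a silent fault may fire before the observed event
            for s0 in (s, TRANSITIONS.get((s, "fault"), s)):
                if (s0, ev) in TRANSITIONS:
                    nxt.add(TRANSITIONS[(s0, ev)])
        states = nxt
    return states

print(estimate(["start", "done"]))   # {'idle'}: nominal cycle
print(estimate(["start", "start"]))  # {'broken'}: only a fault explains this
```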

Relevance:

30.00%

Publisher:

Abstract:

Despite the widespread popularity of linear models for correlated outcomes (e.g. linear mixed models and time series models), distribution diagnostic methodology remains relatively underdeveloped in this context. In this paper we present an easy-to-implement approach that lends itself to graphical displays of model fit. Our approach involves multiplying the estimated marginal residual vector by the Cholesky decomposition of the inverse of the estimated marginal variance matrix. The resulting "rotated" residuals are used to construct an empirical cumulative distribution function and pointwise standard errors. The theoretical framework, including conditions and asymptotic properties, involves technical details that are motivated by Lange and Ryan (1989), Pierce (1982), and Randles (1982). Our method appears to work well in a variety of circumstances, including models having independent units of sampling (clustered data) and models for which all observations are correlated (e.g., a single time series). Our methods can produce satisfactory results even for models that do not satisfy all of the technical conditions stated in our theory.
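The rotation step itself takes only a few lines: with estimated marginal covariance V̂ = LLᵀ, premultiplying the marginal residuals by L⁻¹ (a triangular square root of V̂⁻¹) makes them approximately uncorrelated with unit variance, ready for an ECDF comparison against the standard normal. The simulated random-intercept data below, with the covariance treated as known, are assumptions of this sketch.

```python
# Sketch: "rotated" residuals for a correlated-outcome linear model.
# Simulated random-intercept data; V is taken as known here for simplicity.
import numpy as np
from scipy.linalg import cholesky, solve_triangular
from scipy.stats import norm

rng = np.random.default_rng(1)
n_clusters, m = 200, 5                     # 200 clusters of size 5 (assumed)
n = n_clusters * m
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta = np.array([1.0, 2.0])

# Exchangeable covariance within each cluster: sigma_b^2 * J + sigma_e^2 * I
sigma_b2, sigma_e2 = 0.5, 1.0
V_block = sigma_b2 * np.ones((m, m)) + sigma_e2 * np.eye(m)
b = rng.normal(0, np.sqrt(sigma_b2), n_clusters).repeat(m)
y = X @ beta + b + rng.normal(0, np.sqrt(sigma_e2), n)

resid = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]   # marginal residuals
L = cholesky(V_block, lower=True)                      # V_block = L @ L.T
rotated = np.concatenate([
    solve_triangular(L, resid[i * m:(i + 1) * m], lower=True)
    for i in range(n_clusters)
])                                                     # approx. iid N(0, 1)

# ECDF of rotated residuals vs. the standard normal CDF
s = np.sort(rotated)
ecdf = np.arange(1, n + 1) / n
print("max |ECDF - Phi|: %.3f" % np.abs(ecdf - norm.cdf(s)).max())
```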

Relevance:

30.00%

Publisher:

Abstract:

Mode of access: Internet.

Relevance:

30.00%

Publisher:

Abstract:

Within Africa, the burden of heart failure is significant. This arises from the increase in cardiovascular disease and associated risk factors such as hypertension and diabetes, as well as from causes of heart failure particular to sub-Saharan Africa, such as endomyocardial fibrosis (EMF). The lack of access to echocardiography and other imaging modalities, from both a cost and a technical perspective, combined with the predominantly rural nature of many countries with poor transport links, means that the vast majority of people never obtain an appropriate diagnosis. Similarly, research has been limited on the causes and treatment of heart failure in Africa, and in particular on endemic causes such as EMF and rheumatic heart disease. This review outlines the burden of heart failure in Africa and highlights the opportunity to expand diagnosis through the use of biomarkers, in particular natriuretic peptides. This builds on the success of point-of-care testing in human immunodeficiency virus and tuberculosis, which has been extensively deployed in community settings in Africa.