317 results for "Short-circuit faults diagnostic"
Abstract:
Recent research has revealed the existence of an elegant defence mechanism in plants and lower eukaryotes. The mechanism, known in plants as post-transcriptional gene silencing, works through sequence-specific degradation of RNA. It appears to be directed by double-stranded RNA, associated with the production of short 21-25 nt RNAs, and spread through the plant by a diffusible signal. The short RNAs are implicated as the guides for both a nuclease complex that degrades the mRNA and a methyltransferase complex that methylates the DNA of silenced genes. It has also been suggested that these short RNAs might be the mobile silencing signal, a suggestion that has been challenged recently.
Abstract:
Cyclostationary models for the diagnostic signals measured on faulty rotating machinery have proved successful in many laboratory tests and industrial applications. The squared envelope spectrum has been identified as the most efficient indicator for assessing second-order cyclostationary symptoms of damage, which are typical, for instance, of rolling element bearing faults. In an attempt to foster the spread of rotating machinery diagnostics, the current trend in the field is to reach higher levels of automation in condition monitoring systems. For this purpose, statistical tests for the presence of cyclostationarity have been proposed in recent years. The statistical thresholds proposed in the past for the identification of cyclostationary components were obtained under the hypothesis that the signal is white noise when the component is healthy. This assumption, at odds with the non-white nature of real signals, makes it necessary to pre-whiten the signal or filter it in optimal narrow bands, increasing the complexity of the algorithm and the risk of losing diagnostic information or introducing biases in the result. In this paper, the authors introduce an original analytical derivation of the statistical tests for cyclostationarity in the squared envelope spectrum, dropping the hypothesis of white noise from the outset. The effect of first-order and second-order cyclostationary components on the distribution of the squared envelope spectrum will be quantified and the effectiveness of the newly proposed threshold verified, providing a sound theoretical basis and a practical starting point for efficient automated diagnostics of machine components such as rolling element bearings. The analytical results will be verified by means of numerical simulations and experimental vibration data of rolling element bearings.
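For orientation, the squared envelope spectrum at the centre of these statistical tests can be computed via the analytic signal; the following is a minimal NumPy/SciPy sketch with an illustrative bearing-like signal (the 87 Hz fault frequency and all parameters are hypothetical, and this is not the authors' implementation):

```python
import numpy as np
from scipy.signal import hilbert

def squared_envelope_spectrum(x, fs):
    """Squared envelope spectrum (SES) of a vibration signal x sampled at fs Hz."""
    envelope_sq = np.abs(hilbert(x)) ** 2        # squared envelope via the analytic signal
    envelope_sq -= envelope_sq.mean()            # remove DC so fault lines stand out
    ses = np.abs(np.fft.rfft(envelope_sq)) / len(x)
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    return freqs, ses

# Illustrative use: a 3 kHz resonance excited at a hypothetical 87 Hz fault rate,
# buried in white noise; the SES should peak near 87 Hz and its harmonics.
fs = 20_000
t = np.arange(0, 1.0, 1.0 / fs)
impacts = (np.sin(2 * np.pi * 87 * t) > 0.99).astype(float)   # sparse impact train
x = impacts * np.sin(2 * np.pi * 3_000 * t) + 0.1 * np.random.randn(len(t))
freqs, ses = squared_envelope_spectrum(x, fs)
```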
Abstract:
Diagnostics of rolling element bearings involves a combination of different signal enhancement and analysis techniques. The most common procedure consists of a first step of order tracking and synchronous averaging, which removes from the signal the undesired components synchronous with the shaft harmonics, and a final step of envelope analysis to obtain the squared envelope spectrum. This indicator has been studied thoroughly, and statistically based criteria have been obtained to identify damaged bearings. The statistical thresholds are valid only if all the deterministic components in the signal have been removed. Unfortunately, in various industrial applications characterized by heterogeneous vibration sources, the first step of synchronous averaging is not sufficient to eliminate the deterministic components completely, and an additional pre-whitening step is needed before the envelope analysis. Different techniques have been proposed in the past with this aim: the most widespread are linear prediction filters and spectral kurtosis. Recently, a new pre-whitening technique based on cepstral analysis has been proposed: the so-called cepstrum pre-whitening. Owing to its low computational requirements and its simplicity, it seems a good candidate to perform the intermediate pre-whitening step in an automatic damage recognition algorithm. In this paper, the effectiveness of the new technique will be tested on data measured on a full-scale industrial bearing test rig able to reproduce harsh operating conditions. A benchmark comparison with the traditional pre-whitening techniques will be made as a final verification of the potential of cepstrum pre-whitening.
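Part of the appeal noted here is how little the technique demands computationally: in its commonly published form, cepstrum pre-whitening amounts to flattening the magnitude spectrum while retaining the phase (equivalently, zeroing the real cepstrum everywhere except at zero quefrency). A minimal NumPy sketch, not the paper's exact code:

```python
import numpy as np

def cepstrum_prewhitening(x, eps=1e-12):
    """Whiten a signal by setting its magnitude spectrum to unity, keeping phase.

    Deterministic (discrete-frequency) components are strongly attenuated,
    leaving the impulsive bearing signature for subsequent envelope analysis.
    """
    X = np.fft.fft(x)
    return np.fft.ifft(X / (np.abs(X) + eps)).real   # eps guards against division by zero
```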
Abstract:
In the field of rolling element bearing diagnostics, envelope analysis has in recent years gained a leading role among digital signal processing techniques. The original constraint of constant operating speed has been relaxed by combining this technique with computed order tracking, which resamples signals at constant angular increments. In this way, the field of application of the technique has been extended to cases in which small speed fluctuations occur, while maintaining high effectiveness and efficiency. To make the algorithm suitable for all industrial applications, the constraint on speed has to be removed completely. In fact, in many applications, the coincidence of high bearing loads, and therefore high diagnostic capability, with acceleration-deceleration phases is a further incentive in this direction. This chapter presents a procedure for applying envelope analysis to speed transients. The effect of load variation on the proposed technique will also be qualitatively addressed.
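At its core, computed order tracking is a time-to-angle resampling: the shaft angle is obtained by integrating the instantaneous speed, and the signal is interpolated onto a uniform angular grid before envelope analysis. A minimal sketch, assuming a tacho-derived speed profile is available (linear interpolation for brevity; production implementations typically use higher-order schemes):

```python
import numpy as np

def order_track(x, t, speed_hz, samples_per_rev=64):
    """Resample x(t) at constant angular increments (computed order tracking).

    x        : vibration samples, uniform in time
    t        : sample instants in seconds
    speed_hz : instantaneous shaft speed in rev/s at each sample (e.g. from a tacho)
    """
    # Cumulative shaft angle in revolutions (trapezoidal integration of speed).
    theta = np.concatenate(
        ([0.0], np.cumsum(0.5 * (speed_hz[1:] + speed_hz[:-1]) * np.diff(t))))
    # Uniform angular grid, then interpolate the signal onto it.
    theta_grid = np.arange(0.0, theta[-1], 1.0 / samples_per_rev)
    return theta_grid, np.interp(theta_grid, theta, x)
```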
Abstract:
This paper proposes techniques to improve the performance of i-vector based speaker verification systems when only short utterances are available. Short-utterance i-vectors vary with speaker, session variations, and the phonetic content of the utterance. Well-established methods such as linear discriminant analysis (LDA), source-normalized LDA (SN-LDA) and within-class covariance normalisation (WCCN) exist for compensating the session variation, but we have identified the variability introduced by phonetic content due to utterance variation as an additional source of degradation when short-duration utterances are used. To compensate for utterance variations in short-utterance i-vector speaker verification systems using cosine similarity scoring (CSS), we introduce a short utterance variance normalization (SUVN) technique and a short utterance variance (SUV) modelling approach at the i-vector feature level. A combination of SUVN with LDA and SN-LDA is proposed to compensate for the session and utterance variations and is shown to improve performance over the traditional approach of using LDA and/or SN-LDA followed by WCCN. An alternative approach is also introduced, using a probabilistic linear discriminant analysis (PLDA) approach to directly model the SUV. The combination of SUVN, LDA and SN-LDA followed by SUV PLDA modelling provides an improvement over the baseline PLDA approach. We also show that, for this combination of techniques, the utterance variation information needs to be artificially added to full-length i-vectors for PLDA modelling.
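For context, the CSS baseline referred to above scores a trial as the cosine of the angle between two channel-compensated i-vectors. A minimal sketch follows, where the projection matrix A stands in for an LDA or SN-LDA transform estimated offline on background data (illustrative only; SUVN and SUV are the paper's own contributions and are not reproduced here):

```python
import numpy as np

def css_score(w_enrol, w_test, A):
    """Cosine similarity score between enrolment and test i-vectors
    after a session-compensating projection A (e.g. LDA or SN-LDA)."""
    e, s = A @ w_enrol, A @ w_test
    return float(e @ s / (np.linalg.norm(e) * np.linalg.norm(s)))

# Illustrative use with random stand-ins for 400-dim i-vectors and a 400->200 projection.
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 400))
score = css_score(rng.standard_normal(400), rng.standard_normal(400), A)
```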
Abstract:
A multi-resource multi-stage scheduling methodology is developed to solve short-term open-pit mine production scheduling problems as a generic multi-resource multi-stage scheduling problem. It is modelled using essential characteristics of short-term mining production operations such as drilling, sampling, blasting and excavating under the capacity constraints of mining equipment at each processing stage. Based on an extended disjunctive graph model, a shifting-bottleneck-procedure algorithm is enhanced and applied to obtain feasible short-term open-pit mine production schedules and near-optimal solutions. The proposed methodology and its solution quality are verified and validated using a real mining case study.
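Once the disjunctions (the processing order on each piece of equipment) have been selected, evaluating a schedule on the disjunctive graph reduces to a longest-path computation over a DAG; this is the inner step that a shifting bottleneck procedure repeats as it fixes one resource at a time. A minimal sketch with hypothetical data (two blocks sharing one drill rig), not the paper's model:

```python
from collections import defaultdict, deque

def makespan(durations, arcs):
    """Critical-path makespan of a disjunctive graph whose disjunctions are fixed.

    durations : dict node -> processing time
    arcs      : (u, v) precedence arcs: conjunctive stage order within each block
                plus the selected disjunctive arcs on each shared resource.
    """
    succ, indeg = defaultdict(list), defaultdict(int)
    for u, v in arcs:
        succ[u].append(v)
        indeg[v] += 1
    start = {n: 0.0 for n in durations}
    queue = deque(n for n in durations if indeg[n] == 0)
    while queue:                                   # process nodes in topological order
        u = queue.popleft()
        for v in succ[u]:
            start[v] = max(start[v], start[u] + durations[u])
            indeg[v] -= 1
            if indeg[v] == 0:
                queue.append(v)
    return max(start[n] + durations[n] for n in durations)

# Hypothetical data: drill -> sample -> blast -> excavate for two blocks,
# with one drill rig assigned to block 1 before block 2 (the selected disjunction).
durations = {"d1": 3, "s1": 1, "b1": 2, "e1": 4, "d2": 3, "s2": 1, "b2": 2, "e2": 4}
arcs = [("d1", "s1"), ("s1", "b1"), ("b1", "e1"),
        ("d2", "s2"), ("s2", "b2"), ("b2", "e2"),
        ("d1", "d2")]
print(makespan(durations, arcs))   # 13.0 for this toy instance
```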
Abstract:
To enhance the performance of the k-nearest neighbors approach in forecasting short-term traffic volume, this paper proposes and tests a two-step approach capable of forecasting multiple steps ahead. In selecting the k nearest neighbors, a time-constraint window is introduced, and local minima of the distances between state vectors are then ranked to avoid overlap among candidates. Moreover, to control the undesirable impact of extreme values, a novel algorithm with attractive analytical features is developed based on the principal component. The enhanced KNN method has been evaluated using field data, and our comparison analysis shows that it outperformed the competing algorithms in most cases.
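A minimal sketch of this flavour of approach, with the time-constraint window enforced when picking neighbors and inverse-distance weighting of their successors (illustrative parameter choices; the paper's exact two-step algorithm and principal-component extreme-value control are not reproduced):

```python
import numpy as np

def knn_forecast(history, state_len=6, k=5, window=4):
    """One-step-ahead traffic volume forecast by k-nearest-neighbor state matching.

    history : 1-D sequence of past volumes at fixed intervals.
    Neighbors closer than `window` steps to an already-chosen one are skipped,
    so candidate states do not overlap in time.
    """
    history = np.asarray(history, dtype=float)
    state = history[-state_len:]                       # current state vector
    n = len(history) - state_len                       # candidates with a known successor
    dists = np.array([np.linalg.norm(history[i:i + state_len] - state)
                      for i in range(n)])
    chosen = []
    for idx in np.argsort(dists):                      # nearest first
        if all(abs(idx - j) >= window for j in chosen):
            chosen.append(idx)
        if len(chosen) == k:
            break
    successors = history[[j + state_len for j in chosen]]
    weights = 1.0 / (dists[chosen] + 1e-9)             # inverse-distance weighting
    return float(np.sum(weights * successors) / np.sum(weights))
```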
Abstract:
"Principles of Addiction provides a solid understanding of the definitional and diagnostic differences between use, abuse, and disorder. It describes in great detail the characteristics of these syndromes and various etiological models. The book's three main sections examine the nature of addiction, including epidemiology, symptoms, and course; alcohol and drug use among adolescents and college students; and detailed descriptions of a wide variety of addictive behaviors and disorders, encompassing not only drugs and alcohol, but caffeine, food, gambling, exercise, sex, work, social networking, and many other areas. This volume is especially important in providing a basic introduction to the field as well as an in-depth review of our current understanding of the nature and process of addictive behaviors. Principles of Addiction is one of three volumes comprising the 2,500-page series, Comprehensive Addictive Behaviors and Disorders. This series provides the most complete collection of current knowledge on addictive behaviors and disorders to date. In short, it is the definitive reference work on addictions."--publisher website
Abstract:
This study aimed to determine if systematic variation of the diagnostic terminology embedded within written discharge information (i.e., concussion or mild traumatic brain injury, mTBI) would produce different expected symptoms and illness perceptions. We hypothesized that, compared to concussion advice, mTBI advice would be associated with worse outcomes. Sixty-two volunteers with no history of brain injury or neurological disease were randomly allocated to one of two conditions in which they read an mTBI vignette followed by information that varied only in its use of the embedded terms concussion (n = 28) or mTBI (n = 34). Both groups reported illness perceptions (timeline and consequences subscales of the Illness Perception Questionnaire-Revised) and expected Postconcussion Syndrome (PCS) symptoms 6 months post-injury (Neurobehavioral Symptom Inventory, NSI). Statistically significant group differences due to terminology were found on selected NSI scores (i.e., total, cognitive and sensory symptom cluster scores; concussion > mTBI), but there was no effect of terminology on illness perception. When embedded in discharge advice, diagnostic terminology affects some but not all expected outcomes. Given that such expectations are a known contributor to poor mTBI outcome, clinicians should consider the potential impact of varied terminology on their patients.
Abstract:
Background and aims: The assessment of intra-epidermal nerve fiber density (IENFD) in skin biopsies and corneal nerve fiber density (CNFD) using corneal confocal microscopy (CCM) provides promising techniques to detect small nerve fiber damage in patients with peripheral neuropathy. To help define the clinical utility of each of these techniques in patients with diabetic neuropathy we have assessed sensitivity and specificity of IENFD and CNFD in predicting the following: 1) diabetic polyneuropathy (DPN); 2) risk of foot ulceration (RFU); 3) initial small fiber neuropathy (iSFN); 4) severe small fiber neuropathy (sSFN)...
Abstract:
We propose a framework for adaptive security from hard random lattices in the standard model. Our approach borrows from the recent Agrawal-Boneh-Boyen families of lattices, which can admit reliable and punctured trapdoors, respectively used in reality and in simulation. We extend this idea to make the simulation trapdoors cancel not for a specific forgery but on a non-negligible subset of the possible challenges. Conceptually, we build a compactly representable, large family of input-dependent “mixture” lattices, set up with trapdoors that “vanish” for a secret subset which we hope the forger will target. Technically, we tweak the lattice structure to achieve “naturally nice” distributions for arbitrary choices of subset size. The framework is very general. Here we obtain fully secure signatures, and also IBE, that are compact, simple, and elegant.
Abstract:
We describe a short signature scheme that is strongly existentially unforgeable under an adaptive chosen message attack in the standard security model. Our construction works in groups equipped with an efficient bilinear map, or, more generally, an algorithm for the Decision Diffie-Hellman problem. The security of our scheme depends on a new intractability assumption we call Strong Diffie-Hellman (SDH), by analogy to the Strong RSA assumption with which it shares many properties. Signature generation in our system is fast and the resulting signatures are as short as DSA signatures for comparable security. We give a tight reduction proving that our scheme is secure in any group in which the SDH assumption holds, without relying on the random oracle model.
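For orientation, the published construction has a compact shape; it is sketched below in standard notation (secret key (x, y), generators g_1, g_2, public values u = g_2^x and v = g_2^y, message m encoded in Z_p, degenerate choices of r omitted), with verification costing essentially one pairing comparison:

```latex
\[
  \sigma = g_1^{\,1/(x + m + y r)}, \qquad r \xleftarrow{\$} \mathbb{Z}_p,
  \qquad \text{signature } (\sigma, r)
\]
\[
  \text{accept iff}\quad
  e\!\left(\sigma,\; u \cdot g_2^{\,m} \cdot v^{\,r}\right) = e(g_1, g_2).
\]
\[
  q\text{-SDH: given } \left(g,\, g^{x},\, g^{x^{2}},\, \dots,\, g^{x^{q}}\right)
  \text{ it is hard to output any pair } \left(c,\; g^{1/(x+c)}\right).
\]
```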
Abstract:
The past decade has seen an increase in the number of significant natural disasters that have caused considerable loss of life as well as damage to property markets in the affected areas. In many cases, these disasters have caused not only significant property damage but the total destruction of property in the location. With these disasters attracting considerable media attention, the public are more aware of where the affected property markets are, as well as of the extent of the damage to the properties concerned. This heightened level of awareness is bound to have an impact on participants in the property market, whether a developer, vendor/seller or investor. To assess this issue, a residential property market that was affected by a significant natural disaster over the past two years has been analysed to determine the overall impact of the disaster on buyer, renter and vendor behaviour, as well as on prices in these residential markets. This paper is based on data from the Brisbane flood in January 2011. This natural disaster resulted in loss of life and partial and total devastation of considerable residential property sectors. Data for the research are based on the residential sales and rental listings for each week of the study period, used to determine the level of activity in the specific property sectors; these are also compared with the median house prices for the various suburbs over the same period, with suburbs classified as either flood-affected or flood-free. As 48 suburbs are included in the study, it has been possible to group them on a socio-economic basis to determine possible differences due to location and value. Data were accessed from realestate.com.au, a free real estate site that provides details of current rental and sales listings on a suburb basis; RP Data, a commercial property sales database; and the Australian Bureau of Statistics. The paper found that sales listings fell immediately after the flood in the affected areas, but there was no corresponding fall or increase in sales listings in the flood-free suburbs. There was a significant decrease in the number of rental listings following the flood, as affected parties sought alternative accommodation. The greatest fall in rental listings was in areas close to the flood-affected suburbs, indicating the desire to be close to the flooded property during the repair period.
Abstract:
Used frequently in food contact materials, bisphenol A (BPA) has been studied extensively in recent years, and ubiquitous exposure in the general population has been demonstrated worldwide. Characterising within- and between-individual variability of BPA concentrations is important for interpreting exposure in biomonitoring studies; this has been investigated previously in adults, but not in children. The aim of this study was to characterise the short-term variability of BPA in spot urine samples from young children. Children aged ≥2-<4 years (n = 25) were recruited from an existing cohort in Queensland, Australia, and each donated four spot urine samples over a two-day period. Samples were analysed for total BPA using isotope dilution online solid phase extraction-liquid chromatography-tandem mass spectrometry, and concentrations ranged from 0.53 to 74.5 ng/ml, with geometric mean and standard deviation of 2.70 ng/ml and 2.94 ng/ml, respectively. Sex and time of sample collection were not significant predictors of BPA concentration. The between-individual variability was approximately equal to the within-individual variability (ICC = 0.51), an ICC somewhat higher than previously reported literature values. This may be the result of physiological or behavioural differences between children and adults, or of the relatively short exposure window assessed. Using a bootstrapping methodology, a single sample resulted in correct tertile classification approximately 70% of the time. This study suggests that single spot samples obtained from young children provide a reliable characterisation of absolute and relative exposure over the short time window studied, but this may not hold true over longer timeframes.
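The bootstrap check described here (how often one spot sample lands a child in the correct exposure tertile) is straightforward to re-create in outline. The sketch below is an illustrative re-implementation, not the study's code, and assumes the concentrations are arranged as one row per child:

```python
import numpy as np

rng = np.random.default_rng(0)

def tertile_agreement(conc, n_boot=2000):
    """Fraction of children whom a single random spot sample places in the same
    exposure tertile as their own multi-sample mean (bootstrap average).

    conc : array of shape (n_children, n_samples_per_child), e.g. (25, 4).
    """
    conc = np.asarray(conc, dtype=float)
    n = len(conc)
    means = conc.mean(axis=1)
    true_tertile = np.digitize(means, np.quantile(means, [1/3, 2/3]))
    agree = 0.0
    for _ in range(n_boot):
        # Draw one random spot sample per child and re-rank into tertiles.
        single = conc[np.arange(n), rng.integers(0, conc.shape[1], size=n)]
        agree += np.mean(np.digitize(single, np.quantile(single, [1/3, 2/3]))
                         == true_tertile)
    return agree / n_boot
```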
Abstract:
Geoscientists are confronted with the challenge of assessing nonlinear phenomena that result from multiphysics coupling across multiple scales, from the quantum level to the scale of the Earth and from femtoseconds to the 4.5 Ga history of our planet. In this review we neglect electromagnetic modelling of the processes in the Earth's core, and focus on four types of couplings that underpin fundamental instabilities in the Earth. These are thermal (T), hydraulic (H), mechanical (M) and chemical (C) processes, which are driven and controlled by the transfer of heat to the Earth's surface. Instabilities appear as faults, folds, compaction bands, shear/fault zones, plate boundaries and convective patterns. Convective patterns emerge from buoyancy overcoming viscous drag at a critical Rayleigh number. All other processes emerge from non-conservative thermodynamic forces with a critical dissipative source term, which can be characterised by the modified Gruntfest number Gr. These dissipative processes reach a quasi-steady state when, at maximum dissipation, THMC diffusion (Fourier, Darcy, Biot, Fick) balances the source term. The emerging steady-state dissipative patterns are defined by the respective diffusion length scales. These length scales provide a fundamental thermodynamic yardstick for measuring instabilities in the Earth. The implementation of a fully coupled THMC multiscale theoretical framework into an applied workflow is still in its early stages. This is largely owing to the four fundamentally different lengths of the THMC diffusion yardsticks, spanning micrometres to tens of kilometres, compounded by the additional necessity of considering microstructure information in the formulation of enriched continua for THMC feedback simulations (i.e., a microstructure-enriched continuum formulation). Another challenge is the important factor of time, which implies that the geomaterial is often very far from initial yield and flowing on a time scale that cannot be accessed in the laboratory. This leads to the requirement of adopting a thermodynamic framework in conjunction with flow theories of plasticity. Unlike consistency plasticity, this framework allows the description of both solid-mechanical and fluid-dynamic instabilities. In the applications we show the similarity of THMC feedback patterns across scales, such as brittle and ductile folds and faults. A particularly interesting case is discussed in detail, in which ductile compaction bands appear out of the fluid-dynamic solution; these are akin to, and can be confused with, their brittle siblings. The main difference is that they require the factor of time and much lower driving forces to emerge. These low-stress solutions cannot be obtained on short laboratory time scales, and they are therefore much more likely to appear in nature than in the laboratory. We finish with a multiscale description of a seminal structure in the Swiss Alps, the Glarus thrust, which puzzled geologists for more than 100 years. Along the Glarus thrust, a km-scale package of rocks (nappe) has been pushed 40 km over its footwall as a solid rock body. The thrust itself is a m-wide ductile shear zone, while the centre of the thrust shows a mm-cm wide central slip zone experiencing periodic extreme deformation akin to a stick-slip event. The m-wide creeping zone is consistent with the THM feedback length scale of solid mechanics, while the ultralocalised central slip zone is most likely a fluid-dynamic instability.
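The two dimensionless criteria invoked above have familiar textbook shapes; the following is a hedged sketch (the review's own nondimensionalisation, and the precise definition of its modified Gruntfest number for each dissipative source term, may differ):

```latex
\[
  \mathrm{Ra} \;=\; \frac{\rho\, g\, \alpha\, \Delta T\, L^{3}}{\kappa\, \mu}
  \;>\; \mathrm{Ra}_{c}
  \qquad \text{(convection: buoyancy overcoming viscous drag)}
\]
\[
  \ell_{i} \;=\; \sqrt{\kappa_{i}\, t^{*}}, \qquad i \in \{T, H, M, C\},
\]
% where $\kappa_i$ is the relevant thermal, hydraulic, mechanical or chemical
% diffusivity and $t^{*}$ the characteristic time of the dissipative process,
% giving the diffusion length-scale yardstick for each coupled process.
```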