891 results for Fault detection and diagnostics
Abstract:
Esophageal cancer is a prevalent cancer worldwide. Some studies have reported a possible etiologic role of human papillomavirus (HPV) in benign and malignant papillomas of the esophagus, but the conclusions are controversial. In the present study, we investigated an esophageal papilloma from a 30-year-old male patient presenting with dysphagia. HPV DNA was detected by generic PCR using MY09/11 primers, and restriction fragment length polymorphism analysis revealed the presence of HPV54, usually associated with benign genital lesions. Hypermethylation of the p16INK4A gene was also investigated because of its relation to malignant transformation, but no modification was detected in the host gene. Except for incipient reflux, no risk factors such as cigarette smoking, alcohol abuse or an infected sexual partner were recorded. Since esophageal lesions may have malignant potential, HPV detection and typing are useful tools for patient follow-up.
Abstract:
Recent storms in the Nordic countries caused long power outages over large territories. After these disasters, distribution network operators faced the problem of how to ensure adequate quality of supply in such situations. The decision was made to use cable lines rather than overhead lines, which brings new features to distribution networks. The main idea of this work is a comprehensive analysis of medium-voltage distribution networks with long cable lines. The high specific capacitance of cables, combined with long line lengths, gives rise to problems such as high earth fault currents, excessive reactive power flow from the distribution network to the transmission network, and the possibility of an elevated voltage level at the receiving end of cable feeders. The core tasks, however, were to assess the functional ability of the earth fault protection and the possibility of using simplified formulas for calculating operating settings in this network. To provide justified solutions and evaluations of the problems mentioned above, the corresponding calculations were made, and to analyze the behavior of relay protection principles, a PSCAD model of the examined network was created. Evaluation of the voltage rise at the end of a cable line revealed no dangerous increase in voltage level, whereas the excessive reactive power can lead to financial penalties under Finnish regulations. It was shown by calculation that compensation of earth fault currents should be implemented in such networks. PSCAD models of the electrical grid with an isolated neutral, with central compensation, and with hybrid compensation were created. For the network with hybrid compensation, a methodology was proposed for selecting the number and rated power of distributed arc suppression coils.

Based on the experimental results, it was determined that, to guarantee selective and reliable operation of the relay protection, hybrid compensation with a connected high-ohmic resistor should be used. Directional and admittance-based relay protection were tested under these conditions, and the advantages of the novel protection were revealed. For electrical grids with extensive cabling, however, the necessity of a comprehensive approach to relay protection was explained and illustrated. Thus, to organize reliable earth fault protection, it is recommended to use both intermittent and conventional relay protection, with operating settings calculated using the simplified formulas.
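The simplified setting-calculation formulas referred to above rest on the capacitive earth fault current of the galvanically connected cable network. A minimal sketch, assuming a 20 kV, 50 Hz network; the function name and all parameter values are illustrative assumptions, not figures from the thesis:

```python
import math

def earth_fault_current(c0_uf_per_km, length_km, u_ll_kv, freq_hz=50.0):
    """Capacitive earth fault current (A), I_ef = 3*w*C0*U_ph*L.

    c0_uf_per_km -- specific phase-to-earth capacitance, uF/km (assumed value)
    length_km    -- total galvanically connected cable length, km
    u_ll_kv      -- nominal line-to-line voltage, kV
    """
    w = 2 * math.pi * freq_hz                 # angular frequency, rad/s
    c0 = c0_uf_per_km * 1e-6                  # specific capacitance, F/km
    u_ph = u_ll_kv * 1e3 / math.sqrt(3)       # phase-to-earth voltage, V
    return 3 * w * c0 * u_ph * length_km

# Example: 20 kV network, 0.3 uF/km cable, 40 km of connected cable
i_ef = earth_fault_current(0.3, 40.0, 20.0)   # roughly 130 A
```

A current of this magnitude is exactly why compensation (arc suppression coils) becomes necessary as the share of cable grows.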
Abstract:
The increasing presence of products derived from genetically modified (GM) plants in human and animal diets has led to the development of detection methods to distinguish biotechnology-derived foods from conventional ones. Conventional and real-time PCR have been used, respectively, to detect and quantify GM residues in highly processed foods. DNA extraction is a critical step in the analysis process. Factors such as DNA degradation, matrix effects, and the presence of PCR inhibitors mean that a detection or quantification limit established for a given method is restricted to the matrix used during validation and cannot be extrapolated to matrices outside the scope of the method. In Brazil, sausage samples were the main class of processed products in which Roundup Ready® (RR) soybean residues were detected. Thus, the validation of methodologies for the detection and quantification of those residues is absolutely necessary. Sausage samples were subjected to two different DNA extraction methods: a modified Wizard method and the CTAB method. The yield and quality were compared for both methods. DNA samples were analyzed by conventional and real-time PCR for the detection and quantification of Roundup Ready® soybean in the samples. At least 200 ng of total sausage DNA was necessary for a reliable quantification. Reactions containing DNA amounts below this value led to large variations in the expected GM percentage value. In conventional PCR, the detection limit varied from 1.0 to 500 ng, depending on the GM soybean content of the sample. The precision, performance, and linearity were relatively high, indicating that the method used for analysis was satisfactory.
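For context, real-time PCR quantification of GM content is commonly expressed as a ratio of transgene to reference-gene signal against a calibrator, e.g. with the 2^-ddCt method. A minimal sketch with invented Ct values; the function, the Ct numbers, and the assumption of 100% amplification efficiency are illustrative, not the study's exact procedure:

```python
def gm_percent(ct_transgene, ct_reference, ct_trans_calibrator, ct_ref_calibrator):
    """Relative GM content (%) via the 2^-ddCt method against a 100% GM calibrator.

    Assumes equal (100%) amplification efficiency for both targets.
    """
    d_ct_sample = ct_transgene - ct_reference            # sample delta-Ct
    d_ct_calibrator = ct_trans_calibrator - ct_ref_calibrator  # calibrator delta-Ct
    dd_ct = d_ct_sample - d_ct_calibrator
    return 100.0 * 2.0 ** (-dd_ct)

# Invented example: transgene Ct 30.0 vs lectin Ct 25.0 in the sample,
# 26.0 vs 26.0 in a 100% GM calibrator
pct = gm_percent(30.0, 25.0, 26.0, 26.0)   # -> 3.125 (% GM)
```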
Abstract:
A method using Liquid Chromatography Tandem Mass Spectrometry (LC-MS/MS) with a matrix-matched calibration curve was developed and validated for determining ochratoxin A (OTA) in green coffee. Linearity was observed between 3.0 and 23.0 ng.g-1. Mean recoveries ranged between 90.45% and 108.81%; the relative standard deviation under repeatability and intermediate precision conditions ranged from 5.39% to 9.94% and from 2.20% to 14.34%, respectively. The limits of detection and quantification were 1.2 ng.g-1 and 3.0 ng.g-1, respectively. The method developed was suitable and contributed to the field of mycotoxin analysis, and it will be used for the future production of a Certified Reference Material (CRM) for OTA in coffee.
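The recovery and repeatability figures above follow standard validation formulas. A minimal sketch with invented replicate values; only the formulas, not the numbers, mirror the validation:

```python
import statistics

def recovery_percent(measured, spiked):
    """Mean recovery as a percentage of the spiked concentration."""
    return 100.0 * measured / spiked

def rsd_percent(values):
    """Relative standard deviation (CV%) of replicate determinations."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical replicates (ng/g) for a 10 ng/g OTA spike level
replicates = [9.1, 9.8, 9.4, 10.2, 9.6]
mean_rec = recovery_percent(statistics.mean(replicates), 10.0)   # ~96.2 %
cv = rsd_percent(replicates)                                     # ~4.3 %
```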
Abstract:
Autism is a developmental disorder characterized by abnormal social interactions and communication as well as repetitive and restricted activities and interests. There is evidence of a genetic component, as 5% of younger siblings are diagnosed if their older sibling has been diagnosed. Autism is generally not diagnosed until age 3 at the earliest, yet it has been shown that early intervention for children with autism can greatly increase their functioning. Because of this, it is important that symptoms of autism are identified as early as possible so that diagnosis can occur as soon as possible, allowing these children the earliest intervention. This thesis was divided into two parts. The first examined the psychometrics of two proposed measures, the Parent Observation Checklist (POC), administered monthly, and the Infant Behavior Summary Evaluation (IBSE), administered bimonthly, to see whether they can be used with the infant population to identify autistic symptoms in infants who are at high risk for autism or related problems because they have an older sibling with autism. Study 1 reported acceptable psychometric properties of both the POC and IBSE in terms of test-retest reliability, internal consistency, construct validity and predictive validity. These results provide preliminary evidence that parent-report measures can help to detect early symptoms of ASD in infants. The POC was shown to differentiate infants who were diagnosed from a matched group that was not diagnosed by 3 years of age. The second part of this thesis involved a telephone interview of parents who reported developmental and/or behavior problems in their high-risk infants that may be early signs of Autism Spectrum Disorder (ASD).

During the interview, a service questionnaire was administered to determine what interventions (including strategies recommended by the researchers) their at-risk infants and affected older siblings were receiving, how satisfied the parents were with them, and how effective they felt the interventions were. Study 2 also yielded promising results. Parents utilized a variety of services for at-risk infants and children with ASD. The interventions ranged from empirically validated early intervention (e.g., ABA) to non-empirically validated treatments (e.g., diet therapy). The large number of non-empirically validated treatments parents used was surprising, yet parents reported being involved and satisfied, and thought that the services were effective. Parents' perceptions of their stress levels went down slightly and feelings of competence rose when they accessed services for their infants. Overall, the results of this thesis provide new evidence that parent-report methods hold promise as early detection instruments for ASD in at-risk infants. More research is needed to further validate these instruments as well as to understand the variables related to parents' choice of early intervention for their at-risk and affected children.
Abstract:
On average, approximately 13% of the water withdrawn by Canadian municipal water suppliers is lost before it reaches final users. This is an important topic for several reasons: water losses cost money; losses force water agencies to draw more water from lakes and streams, thereby putting more stress on aquatic ecosystems; leaks reduce system reliability; leaks may contribute to future pipe failures; and leaks may allow contaminants to enter water systems, thereby reducing water quality and threatening the health of water users. Some benefits of leak detection fall outside water agencies' accounting purview (e.g. reduced health risks to households connected to public water supply systems) and, as a result, may not be considered adequately in water agency decision-making. Because of the regulatory environment in which Canadian water agencies operate, some of these benefits (especially those external to the agency or those that may accrue to the agency in future time periods) may not be fully counted when agencies decide on leak detection efforts. Our analysis suggests potential reforms to promote increased leak detection efforts: adoption of a Canada-wide goal of universal water metering; development of full-cost accounting and pricing for water supplies; and co-operation among the provinces to promulgate standards for leak detection efforts and provide incentives to promote improved efficiency and rational investment decision-making.
Advances in therapeutic risk management through signal detection and risk minimisation tool analyses
Abstract:
The four main activities of therapeutic risk management are risk identification, assessment, minimisation, and communication. This thesis addresses issues related to risk identification and minimisation through two studies whose objectives were to: 1) develop and validate a data mining tool for signal detection from Quebec healthcare databases; 2) conduct a systematic review to characterize the risk minimisation interventions (RMIs) that have been implemented. The signal detection tool relies on the maximized sequential probability ratio test (MaxSPRT), applied to dispensed-drug and medical-services data collected in a retrospective cohort of 87,389 community-dwelling elderly members of the Quebec health insurance plan between 2000 and 2009. Four known drug-adverse event (AE) associations and two negative controls were used. The systematic review drew on the literature as well as the websites of six major regulatory agencies. The nature of the RMIs was described and gaps in their implementation were identified. The analytical method detected signals in one of the four drug-AE combinations. The main contributions are: a) the first signal detection tool based on Canadian administrative databases; b) methodological contributions through accounting for the depletion-of-susceptibles effect and controlling for patient health status. The review identified 119 RMIs in the literature and 1,112 RMIs on regulatory agency websites. The review showed that RMIs have increased since the introduction of regulatory guidelines in 2005, but their effectiveness remains poorly demonstrated.
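The MaxSPRT statistic underlying the signal detection tool can be sketched for Poisson-distributed event counts. The counts below are invented; only the log-likelihood-ratio formula follows the published method:

```python
import math

def maxsprt_llr(observed, expected):
    """Poisson MaxSPRT log-likelihood ratio for observed vs expected events.

    Non-zero only when observed exceeds expected (one-sided test for excess risk).
    """
    if observed <= expected:
        return 0.0
    return observed * math.log(observed / expected) - (observed - expected)

# A signal is flagged when the cumulative LLR crosses a critical value chosen
# to hold the overall type-I error of the sequential design.
llr = maxsprt_llr(observed=12, expected=5.0)   # ~3.51
```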
Abstract:
Staphylococcal enterotoxin B (SEB) is a highly heat-resistant enteric toxin responsible for more than 50% of cases of foodborne intoxication caused by an enterotoxin. The main objective of this master's project was to develop and validate a method, based on new analytical strategies, for detecting and quantifying SEB in food matrices. A tryptic peptide map was produced, and 3 specific tryptic peptides were selected to serve as reference peptides from the 9 proteolytic fragments identified (35% sequence coverage). Acetic anhydride and its deuterated form were used to synthesize standard peptides labelled with a light and a heavy isotope. Mixtures of the two isotopes at different molar concentrations were used to establish linearity, and the results showed that measurements made by isotope dilution combined with LC-MS/MS met the generally accepted criteria for bioassays, with slope values close to 1, R2 values above 0.98, and coefficients of variation (CV%) below 8%. The precision and accuracy of the method were assessed using chicken meat homogenate samples into which SEB was introduced. SEB was spiked at 0.2, 1 and 2 pmol/g. The analytical results show that the method provides accuracy in the range of 84.9 to 91.1%. Overall, the results presented in this thesis demonstrate that proteomic methods can be used effectively to detect and quantify SEB in food matrices. Keywords: mass spectrometry; isotopic labelling; quantitative proteomics; enterotoxins
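Isotope-dilution quantification as described reads the native peptide amount from the light-to-heavy signal ratio. A minimal sketch with invented peak areas and spike level, not the thesis's actual data processing:

```python
def analyte_amount(area_light, area_heavy, heavy_spike_pmol):
    """Amount of native (light) peptide in pmol from the light/heavy area ratio.

    Assumes the heavy-labelled internal standard co-elutes and ionizes
    identically, so the area ratio equals the molar ratio.
    """
    return (area_light / area_heavy) * heavy_spike_pmol

# Invented example: 1 pmol of heavy-labelled peptide spiked into the digest
amount = analyte_amount(area_light=8.4e5, area_heavy=4.2e5, heavy_spike_pmol=1.0)
# -> 2.0 pmol of native peptide
```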
Abstract:
We propose a new method for quantifying intracardiac vorticity (Doppler vortography), based on conventional Doppler imaging. To characterize vortices, we use an index called the Blood Vortex Signature (BVS), obtained by applying a covariance-based kernel filter. The BVS index measured by Doppler vortography was validated against Doppler fields from simulations and in vitro experiments. Preliminary results obtained in healthy subjects and in patients with cardiac disease are also presented in this thesis. Significant correlations were observed between the vorticity estimated by Doppler vortography and the reference method (in silico: r2 = 0.98; in vitro: r2 = 0.86). Our results suggest that Doppler vortography is a promising cardiac ultrasound technique for quantifying intracardiac vortices. This assessment tool could readily be applied in clinical routine to detect the presence of ventricular insufficiency and to evaluate diastolic function by Doppler echocardiography.
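The reference vorticity against which the BVS index is validated can be estimated from a 2-D velocity field by central differences. A minimal sketch on a synthetic solid-body-rotation field; this is a generic curl estimate, not the covariance-kernel filter of the study:

```python
def vorticity(u, v, dx, dy):
    """Central-difference z-vorticity (dv/dx - du/dy) at interior grid points."""
    ny, nx = len(u), len(u[0])
    w = [[0.0] * nx for _ in range(ny)]
    for j in range(1, ny - 1):
        for i in range(1, nx - 1):
            dvdx = (v[j][i + 1] - v[j][i - 1]) / (2 * dx)
            dudy = (u[j + 1][i] - u[j - 1][i]) / (2 * dy)
            w[j][i] = dvdx - dudy
    return w

# Solid-body rotation u = -y, v = x has uniform vorticity 2 everywhere
n = 5
ys = [j - 2 for j in range(n)]
xs = [i - 2 for i in range(n)]
u = [[-y for _ in xs] for y in ys]
v = [[x for x in xs] for _ in ys]
w = vorticity(u, v, dx=1.0, dy=1.0)   # interior values ~ 2.0
```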
Abstract:
By the end of 2004, the Canadian swine population had experienced a severe increase in the incidence of Porcine circovirus-associated disease (PCVAD), a problem that was associated with the emergence of a new Porcine circovirus-2 genotype (PCV-2b), previously unrecovered in North America. Thus it became important to develop a diagnostic tool that could differentiate between the old and new circulating genotypes (PCV-2a and -2b, respectively). Consequently, a multiplex real-time quantitative polymerase chain reaction (mrtqPCR) assay that could sensitively and specifically identify and differentiate PCV-2 genotypes was developed. A retrospective epidemiological survey that used the mrtqPCR assay was performed to determine if cofactors could affect the risk of PCVAD. From 121 PCV-2–positive cases gathered for this study, 4.13%, 92.56% and 3.31% were positive for PCV-2a, PCV-2b, and both genotypes, respectively. In a data analysis using univariate logistic regressions, PCVAD compatible (PCVAD/c) score was significantly associated with the presence of Porcine reproductive and respiratory syndrome virus (PRRSV), PRRSV viral load, PCV-2 viral load, and PCV-2 immunohistochemistry (IHC) results. Polytomous logistic regression analysis revealed that PCVAD/c score was affected by PCV-2 viral load (P = 0.0161) and IHC (P = 0.0128), but not by the PRRSV variables (P > 0.9), suggesting that mrtqPCR in tissue is a reliable alternative to IHC. Logistic regression analyses revealed that PCV-2 increased the odds ratio of isolating 2 major swine pathogens of the respiratory tract, Actinobacillus pleuropneumoniae and Streptococcus suis serotypes 1/2, 1, 2, 3, 4, and 7, which are serotypes commonly associated with clinical diseases.
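The odds ratios reported by the logistic regression analyses correspond to the familiar 2x2-table quantity. A minimal sketch with invented counts; the numbers are not from the study:

```python
import math

def odds_ratio(a, b, c, d):
    """Odds ratio for exposed (a: outcome+, b: outcome-) vs unexposed (c, d)."""
    return (a / b) / (c / d)

def wald_ci(a, b, c, d, z=1.96):
    """Approximate 95% Wald confidence interval for the odds ratio."""
    log_or = math.log(odds_ratio(a, b, c, d))
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return math.exp(log_or - z * se), math.exp(log_or + z * se)

# Invented example: 30/70 PCV-2-positive vs 10/90 PCV-2-negative cases
# from which a given respiratory pathogen was isolated
or_est = odds_ratio(30, 70, 10, 90)   # -> ~3.86
```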
Abstract:
Department of Computer Applications, Cochin University of Science and Technology
Abstract:
Here we investigate the diversity of pathogenic Vibrio species in marine environments close to Suva, Fiji. We use four distinct yet complementary analyses (biochemical testing, phylogenetic analyses, metagenomic analyses and molecular typing) to provide some preliminary insights into the diversity of vibrios in this region. Taken together, our analyses confirmed the presence of nine Vibrio species, including three of the most important disease-causing vibrios (i.e. V. cholerae, V. parahaemolyticus and V. vulnificus), in Fijian marine environments. Furthermore, since toxigenic V. parahaemolyticus are present on fish destined for consumption, we suggest these bacteria represent a potential public health risk. Our results from Illumina short-read sequencing are encouraging in the context of microbial profiling and biomonitoring. They suggest this approach may offer an efficient and cost-effective method for studying the dynamics of microbial diversity in marine environments over time.
Abstract:
The characterization and grading of glioma tumors via image-derived features, for diagnosis, prognosis, and treatment response, has been an active research area in medical image computing. This paper presents a novel method for automatic detection and classification of glioma from conventional T2-weighted MR images. Automatic detection of the tumor was established using a newly developed method called the Adaptive Gray level Algebraic set Segmentation Algorithm (AGASA). Statistical features were extracted from the detected tumor texture using first-order statistics and gray level co-occurrence matrix (GLCM) based second-order statistical methods. The statistical significance of the features was determined by a t-test and its corresponding p-value. A decision system was developed for glioma grade detection using the selected features and their p-values. The detection performance of the decision system was validated using the receiver operating characteristic (ROC) curve. The diagnosis and grading of glioma using this non-invasive method can contribute promising results to medical image computing.
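Second-order GLCM features of the kind used here are computed from co-occurrence counts of gray levels at a fixed pixel offset. A minimal sketch on a toy image; the image, the offset, and the choice of the contrast feature are illustrative, not the paper's exact pipeline:

```python
def glcm(image, levels, offset=(0, 1)):
    """Co-occurrence counts P[i][j] for pixel pairs separated by `offset` (dj, di)."""
    dj, di = offset
    p = [[0] * levels for _ in range(levels)]
    for j in range(len(image)):
        for i in range(len(image[0])):
            jj, ii = j + dj, i + di
            if 0 <= jj < len(image) and 0 <= ii < len(image[0]):
                p[image[j][i]][image[jj][ii]] += 1
    return p

def contrast(p):
    """GLCM contrast: (i-j)^2 weighted by the normalised co-occurrence matrix."""
    total = sum(sum(row) for row in p)
    return sum((i - j) ** 2 * p[i][j] / total
               for i in range(len(p)) for j in range(len(p)))

img = [[0, 0, 1],
       [1, 2, 2],
       [2, 2, 3]]
c = contrast(glcm(img, levels=4))   # -> 0.5 for this toy image
```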
Abstract:
Freehand sketching is both a natural and crucial part of design, yet is unsupported by current design automation software. We are working to combine the flexibility and ease of use of paper and pencil with the processing power of a computer to produce a design environment that feels as natural as paper, yet is considerably smarter. One of the most basic steps in accomplishing this is converting the original digitized pen strokes in the sketch into the intended geometric objects using feature point detection and approximation. We demonstrate how multiple sources of information can be combined for feature detection in strokes and apply this technique using two approaches to signal processing, one using simple average based thresholding and a second using scale space.
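The average-based thresholding approach can be sketched as follows: estimate a turning angle at each stroke point and keep the points whose angle is well above the stroke's mean. The stroke coordinates and the threshold factor below are invented for illustration:

```python
import math

def turn_angles(points):
    """Absolute turning angle (radians) at each interior stroke point."""
    angles = [0.0] * len(points)
    for k in range(1, len(points) - 1):
        (x0, y0), (x1, y1), (x2, y2) = points[k - 1], points[k], points[k + 1]
        a1 = math.atan2(y1 - y0, x1 - x0)   # incoming segment direction
        a2 = math.atan2(y2 - y1, x2 - x1)   # outgoing segment direction
        d = abs(a2 - a1)
        angles[k] = min(d, 2 * math.pi - d)  # wrap to [0, pi]
    return angles

def feature_points(points, factor=2.0):
    """Indices of points whose turning angle exceeds `factor` times the mean."""
    angles = turn_angles(points)
    mean = sum(angles) / len(angles)
    return [k for k, a in enumerate(angles) if mean > 0 and a > factor * mean]

# An L-shaped stroke: the corner at index 3 should be detected
stroke = [(0, 0), (1, 0), (2, 0), (3, 0), (3, 1), (3, 2), (3, 3)]
corners = feature_points(stroke)   # -> [3]
```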
Abstract:
In this report, a face recognition system capable of detecting and recognizing frontal and rotated faces was developed. Two face recognition methods focusing on pose invariance are presented and evaluated: the whole-face approach and the component-based approach. The main challenge of this project is to develop a system able to identify faces under different viewing angles in real time. The development of such a system will enhance the capability and robustness of current face recognition technology. The whole-face approach recognizes faces by classifying a single feature vector consisting of the gray values of the whole face image. The component-based approach first locates the facial components and extracts them. These components are normalized and combined into a single feature vector for classification. The Support Vector Machine (SVM) is used as the classifier for both approaches. Extensive tests of robustness against pose changes were performed on a database that includes faces rotated up to about 40 degrees in depth. The component-based approach clearly outperforms the whole-face approach on all tests. Although this approach is proven to be more reliable, it is still too slow for real-time applications. For this reason, a real-time face recognition system using the whole-face approach was implemented to recognize people in color video sequences.