852 results for Sensitivity kernel
Abstract:
In Brazil, human and canine visceral leishmaniasis (CVL) caused by Leishmania infantum has undergone urbanisation since 1980, constituting a public health problem, and serological tests are the tools of choice for identifying infected dogs. Until recently, the Brazilian zoonoses control program recommended enzyme-linked immunosorbent assays (ELISA) and indirect immunofluorescence assays (IFA) as the screening and confirmatory methods, respectively, for the detection of canine infection. The purpose of this study was to estimate the accuracy of ELISA and IFA in parallel or serial combinations. The reference standard comprised the results of direct visualisation of parasites in histological sections, immunohistochemical testing, or isolation of the parasite in culture. Samples from 98 cases and 1,327 noncases were included. Individually, the tests presented sensitivities of 91.8% and 90.8% and specificities of 83.4% and 53.4% for the ELISA and IFA, respectively. When the tests were used in parallel combination, sensitivity reached 99.2%, while specificity dropped to 44.8%. When they were used in serial combination (ELISA followed by IFA), decreased sensitivity (83.3%) and increased specificity (92.5%) were observed. The serial testing approach improved specificity with a moderate loss in sensitivity. This strategy could partially fulfill the needs of public health and dog owners for a more accurate diagnosis of CVL.
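Under the simplifying assumption that the two tests err independently given the true infection status, the combined figures reported above can be reproduced from the individual sensitivities and specificities. A minimal sketch (the function names are ours):

```python
def parallel(se1, sp1, se2, sp2):
    """Parallel combination: positive if either test is positive.
    Assumes conditional independence between the two tests."""
    se = 1 - (1 - se1) * (1 - se2)   # a case is missed only if both tests miss
    sp = sp1 * sp2                   # a noncase is negative only if both agree
    return se, sp

def serial(se1, sp1, se2, sp2):
    """Serial combination: positive only if both tests are positive."""
    se = se1 * se2                   # a case is detected only if both detect it
    sp = 1 - (1 - sp1) * (1 - sp2)   # a false positive requires both to err
    return se, sp

# ELISA: Se 91.8%, Sp 83.4%; IFA: Se 90.8%, Sp 53.4%
print(parallel(0.918, 0.834, 0.908, 0.534))  # ~ (0.992, 0.445)
print(serial(0.918, 0.834, 0.908, 0.534))    # ~ (0.834, 0.923)
```

The results match the study's parallel (99.2%/44.8%) and serial (83.3%/92.5%) figures to within rounding, which is consistent with approximately independent test errors.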
Abstract:
We describe here a very simple modification of the auramine staining procedure based on preparation of a UV-fixed thick blotch which allowed us to reach an overall sensitivity of 0.82 (592 acid-fast bacillus [AFB]-positive specimens/722 initial respiratory specimens with positive mycobacterial culture) and sensitivities of 0.93 (526 AFB-positive specimens/564 culture-positive specimens) for Mycobacterium tuberculosis complex and 0.42 (66 AFB-positive specimens/158 culture-positive specimens) for nontuberculous mycobacteria.
Abstract:
PURPOSE: To investigate the influence of demethylation with 5-aza-cytidine (AZA) on radiation sensitivity and to define the intrinsic radiation sensitivity of methylation-deficient colorectal carcinoma cells. METHODS AND MATERIALS: The radiation-sensitizing effects of AZA were investigated in four colorectal carcinoma cell lines (HCT116, SW480, LS174T, Co115), defining the influence of AZA on proliferation, clonogenic survival, and cell cycling with or without ionizing radiation. The methylation status of cancer- or DNA damage response-related genes silenced by promoter methylation was determined. The effect of deletion of the potential target genes (DNMT1, DNMT3b, and double mutants) on radiation sensitivity was analyzed. RESULTS: AZA showed radiation-sensitizing properties at ≥1 µmol/L, a concentration that does not interfere with the cell cycle by itself, in all four tested cell lines, with a sensitivity enhancement ratio (SER) of 1.6 to 2.1 (confidence interval [CI] 0.9-3.3). AZA successfully demethylated the promoters of p16 and hMLH1, genes associated with the ionizing radiation response. Prolonged exposure to low-dose AZA resulted in sustained radiosensitivity if associated with persistent genomic hypomethylation after recovery from AZA. Compared with parental HCT116 cells, DNMT3b-deficient cells were more sensitive to radiation, with a SER of 2.0 (CI 0.9-2.1; p = 0.03), and DNMT3b/DNMT1 double-deficient cells showed a SER of 1.6 (CI 0.5-2.7; p = 0.09). CONCLUSIONS: AZA-induced genomic hypomethylation results in enhanced radiation sensitivity in colorectal carcinoma. The mediators leading to sensitization remain unknown. Defining the specific factors associated with radiation sensitization after genomic demethylation may open the way to better targeting for the purpose of radiation sensitization.
Abstract:
High-resolution tomographic imaging of the shallow subsurface is becoming increasingly important for a wide range of environmental, hydrological and engineering applications. Because of their superior resolution power, their sensitivity to pertinent petrophysical parameters, and their far reaching complementarities, both seismic and georadar crosshole imaging are of particular importance. To date, corresponding approaches have largely relied on asymptotic, ray-based approaches, which only account for a very small part of the observed wavefields, inherently suffer from a limited resolution, and in complex environments may prove to be inadequate. These problems can potentially be alleviated through waveform inversion. We have developed an acoustic waveform inversion approach for crosshole seismic data whose kernel is based on a finite-difference time-domain (FDTD) solution of the 2-D acoustic wave equations. This algorithm is tested on and applied to synthetic data from seismic velocity models of increasing complexity and realism and the results are compared to those obtained using state-of-the-art ray-based traveltime tomography. Regardless of the heterogeneity of the underlying models, the waveform inversion approach has the potential of reliably resolving both the geometry and the acoustic properties of features of the size of less than half a dominant wavelength. Our results do, however, also indicate that, within their inherent resolution limits, ray-based approaches provide an effective and efficient means to obtain satisfactory tomographic reconstructions of the seismic velocity structure in the presence of mild to moderate heterogeneity and in absence of strong scattering. Conversely, the excess effort of waveform inversion provides the greatest benefits for the most heterogeneous, and arguably most realistic, environments where multiple scattering effects tend to be prevalent and ray-based methods lose most of their effectiveness.
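The FDTD forward solver at the core of the waveform inversion can be illustrated with a minimal sketch. The grid size, homogeneous velocity model, and Ricker source wavelet below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def fdtd_acoustic_2d(c, nt, dt, dx, src_pos, src):
    """Minimal 2-D acoustic FDTD, second-order pressure formulation:
    p_next = 2p - p_prev + (c*dt)^2 * laplacian(p).
    Boundaries are periodic via np.roll; real crosshole codes would add
    absorbing boundaries. Illustrative sketch only."""
    p_prev = np.zeros_like(c)
    p = np.zeros_like(c)
    for it in range(nt):
        lap = (np.roll(p, 1, 0) + np.roll(p, -1, 0) +
               np.roll(p, 1, 1) + np.roll(p, -1, 1) - 4 * p) / dx ** 2
        p_next = 2 * p - p_prev + (c * dt) ** 2 * lap
        p_next[src_pos] += src[it] * dt ** 2      # inject source wavelet
        p_prev, p = p, p_next
    return p

c = np.full((100, 100), 1500.0)        # homogeneous 1500 m/s model
dt, dx = 1e-3, 5.0                     # CFL number c*dt/dx = 0.3 < 1/sqrt(2)
t = np.arange(400) * dt
f0 = 25.0                              # dominant frequency (Hz)
a = (np.pi * f0 * (t - 0.04)) ** 2
src = (1 - 2 * a) * np.exp(-a)         # Ricker wavelet
p_final = fdtd_acoustic_2d(c, 400, dt, dx, (50, 50), src)
```

The explicit scheme is stable here because the Courant number (0.3) is below the 2-D limit of 1/√2; an inversion wraps such a solver in a misfit-minimization loop over the velocity model.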
Abstract:
Purpose: To investigate the effect of incremental increases in intraocular straylight on threshold measurements made by three modern forms of perimetry: Standard Automated Perimetry (SAP) using Octopus (Dynamic, G-Pattern), Pulsar Perimetry (PP) (TOP, 66 points), and the Moorfields Motion Displacement Test (MDT) (WEBS, 32 points). Methods: Four healthy young observers were recruited (mean age 26 yrs [25 yrs, 28 yrs]; refractive correction [+2 D, -4.25 D]). Five white opacity filters (WOF), each scattering light by a different amount, were used to create incremental increases in intraocular straylight (IS). Resultant IS values were measured with each WOF and at baseline (no WOF) for each subject using a C-Quant Straylight Meter (Oculus, Wetzlar, Germany). A 25-yr-old has an IS value of ~0.85 log(s); an increase of 40% in IS, to 1.2 log(s), corresponds to the physiological value of a 70-yr-old. Each WOF created an increase in IS of between 10% and 150% from baseline, ranging from effects similar to normal ageing to those found with considerable cataract. Each subject underwent 6 test sessions over a 2-week period; each session consisted of the 3 perimetric tests using one of the five WOFs or baseline (both instrument and filter were randomised). Results: The reduction in sensitivity from baseline was calculated. A two-way ANOVA on mean change in threshold (where subjects were treated as rows in the block and each increment in fog filter was treated as a column) was used to examine the effect of incremental increases in straylight. Both SAP (p<0.001) and Pulsar (p<0.001) were significantly affected by increases in straylight; the MDT (p=0.35) remained comparatively robust. Conclusions: The Moorfields MDT measurement of threshold is robust to the effects of additional straylight compared with SAP and PP.
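The randomized-block two-way ANOVA described above (subjects as blocks, straylight levels as treatments) can be sketched in a few lines. The threshold-change values below are illustrative stand-ins, not the study's measurements:

```python
def block_anova(data):
    """Randomized-block ANOVA without replication.
    data[i][j]: mean threshold change for subject (block) i at filter level j.
    Returns the F statistic for the treatment (filter) effect and its dofs."""
    n_b = len(data)           # blocks (subjects)
    n_t = len(data[0])        # treatments (straylight levels)
    grand = sum(sum(row) for row in data) / (n_b * n_t)
    block_means = [sum(row) / n_t for row in data]
    treat_means = [sum(data[i][j] for i in range(n_b)) / n_b for j in range(n_t)]
    ss_block = n_t * sum((m - grand) ** 2 for m in block_means)
    ss_treat = n_b * sum((m - grand) ** 2 for m in treat_means)
    ss_total = sum((x - grand) ** 2 for row in data for x in row)
    ss_err = ss_total - ss_block - ss_treat
    df_treat, df_err = n_t - 1, (n_b - 1) * (n_t - 1)
    f = (ss_treat / df_treat) / (ss_err / df_err)
    return f, df_treat, df_err

# 4 subjects x 6 levels (baseline + 5 WOFs), dB sensitivity loss (made up):
data = [[0.0, 1.1, 2.3, 3.0, 4.2, 5.1],
        [0.2, 0.9, 2.0, 3.4, 4.0, 5.5],
        [0.1, 1.3, 2.5, 2.9, 4.4, 4.9],
        [0.0, 1.0, 2.2, 3.2, 4.1, 5.2]]
f, df1, df2 = block_anova(data)   # a large F indicates a straylight effect
```

Blocking on subjects removes between-observer differences from the error term, which matters with only four observers.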
Abstract:
Nowadays, the joint exploitation of images acquired daily by remote sensing instruments and of images available from archives allows detailed monitoring of the transitions occurring at the surface of the Earth. These modifications of the land cover generate spectral discrepancies that can be detected via the analysis of remote sensing images. Independently of the origin of the images and of the type of surface change, correct processing of such data implies the adoption of flexible, robust, and possibly nonlinear methods, to correctly account for the complex statistical relationships characterizing the pixels of the images. This Thesis deals with the development and application of advanced statistical methods for multi-temporal optical remote sensing image processing tasks. Three different families of machine learning models have been explored and fundamental solutions for change detection problems are provided. In the first part, change detection with user supervision has been considered. In a first application, a nonlinear classifier has been applied with the intent of precisely delineating flooded regions from a pair of images. In a second case study, the spatial context of each pixel has been injected into another nonlinear classifier to obtain a precise mapping of new urban structures. In both cases, the user provides the classifier with examples of what he or she believes has changed or not. In the second part, a completely automatic and unsupervised method for precise binary detection of changes has been proposed. The technique allows very accurate mapping without any user intervention, which is particularly useful when readiness and reaction times of the system are a crucial constraint. In the third part, the problem of statistical distributions shifting between acquisitions is studied. Two approaches that transform the pair of bi-temporal images and reduce their differences unrelated to changes in land cover are studied.
The methods align the distributions of the images, so that the pixel-wise comparison can be carried out with higher accuracy. Furthermore, the second method can deal with images from different sensors, regardless of the dimensionality of the data or the spectral information content. This opens the door to possible solutions for a crucial problem in the field: detecting changes when the images have been acquired by two different sensors.
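As a concrete illustration of aligning bi-temporal distributions before pixel-wise comparison, here is classical histogram matching on a synthetic single-band pair; this is a textbook stand-in, not one of the thesis's specific methods:

```python
import numpy as np

def histogram_match(src, ref):
    """Remap `src` so its empirical distribution matches that of `ref`.
    A classical radiometric-alignment step before pixel-wise change detection."""
    s_vals, s_idx, s_cnt = np.unique(src.ravel(),
                                     return_inverse=True, return_counts=True)
    r_vals, r_cnt = np.unique(ref.ravel(), return_counts=True)
    s_cdf = np.cumsum(s_cnt) / src.size
    r_cdf = np.cumsum(r_cnt) / ref.size
    mapped = np.interp(s_cdf, r_cdf, r_vals)   # invert the reference CDF
    return mapped[s_idx].reshape(src.shape)

rng = np.random.default_rng(0)
t1 = rng.normal(100, 10, (64, 64))                  # acquisition 1
t2 = 1.3 * t1 + 20 + rng.normal(0, 2, t1.shape)     # acquisition 2: gain/offset shift
t2_aligned = histogram_match(t2, t1)
diff = t2_aligned - t1
change = np.abs(diff) > 3 * diff.std()              # naive pixel-wise detection
```

After matching, radiometric differences unrelated to land-cover change (the simulated gain and offset) are largely removed, so thresholding the difference image becomes meaningful.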
Abstract:
Moisture sensitivity of Hot Mix Asphalt (HMA) mixtures, generally called stripping, is a major form of distress in asphalt concrete pavement. It is characterized by the loss of adhesive bond between the asphalt binder and the aggregate (a failure of the bonding of the binder to the aggregate) or by a softening of the cohesive bonds within the asphalt binder (a failure within the binder itself), both of which are due to the action of traffic loading in the presence of moisture. Tests for evaluating HMA moisture sensitivity fall into two categories: visual inspection tests and mechanical tests. However, most of them were developed before the Superpave mix design method. This research was undertaken to develop a protocol for evaluating the moisture-sensitivity potential of HMA mixtures using the Nottingham Asphalt Tester (NAT). The mechanisms of HMA moisture sensitivity were reviewed and test protocols using the NAT were developed. Different types of blends, as moisture-sensitive and non-moisture-sensitive groups, were used to evaluate the potential of the proposed test. The test results were analyzed with three performance-based parameters: the retained flow number based on critical permanent deformation failure (RFNP), the retained flow number based on cohesion failure (RFNC), and the energy ratio (ER). Analysis based on the energy ratio of elastic strain (EREE) at the flow number of cohesion failure (FNC) has a higher potential to evaluate HMA moisture sensitivity than the other parameters. If the measurement error in the data-acquisition process were removed, analyses based on RFNP and RFNC would also have high potential to evaluate HMA moisture sensitivity. The vacuum-pressure saturation used in AASHTO T 283 and in the proposed test risks damaging the specimen before load application.
Abstract:
The performance of the Xpert MRSA polymerase chain reaction (PCR) assay on pooled nose, groin, and throat swabs (three nylon flocked eSwabs into one tube) was compared to culture by analyzing 5,546 samples. The sensitivity [0.78, 95 % confidence interval (CI) 0.73-0.82] and specificity (0.99, 95 % CI 0.98-0.99) were similar to the results from published studies on separated nose or other specimens. Thus, the performance of the Xpert MRSA assay was not affected by pooling the three specimens into one assay, allowing a higher detection rate without increasing laboratory costs, as compared to nose samples alone.
Abstract:
In this paper, we study the relevance of multiple kernel learning (MKL) for the automatic selection of time series inputs. Recently, MKL has gained great attention in the machine learning community due to its flexibility in modelling complex patterns and performing feature selection. In general, MKL constructs the kernel as a weighted linear combination of basis kernels, exploiting different sources of information. An efficient algorithm wrapping a Support Vector Regression model for optimizing the MKL weights, named SimpleMKL, is used for the analysis. In this sense, MKL performs feature selection by discarding inputs/kernels with low or null weights. The proposed approach is tested with simulated linear and nonlinear time series (autoregressive, Hénon, and Lorenz series).
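The MKL form described above, a weighted linear combination of per-input basis kernels, can be sketched with kernel ridge regression on synthetic data. The weights here are fixed by hand for illustration; SimpleMKL would optimize them (under a simplex constraint, driving the weights of irrelevant kernels to zero), and the paper wraps Support Vector Regression rather than ridge:

```python
import numpy as np

def rbf(a, b, gamma=2.0):
    """Gaussian basis kernel built on a single candidate input."""
    return np.exp(-gamma * (a[:, None] - b[None, :]) ** 2)

rng = np.random.default_rng(1)
x1 = rng.uniform(-2, 2, 200)                    # relevant input
x2 = rng.uniform(-2, 2, 200)                    # irrelevant input
y = np.tanh(2 * x1) + 0.05 * rng.normal(size=200)

tr, te = slice(0, 150), slice(150, 200)

def krr_mse(w, lam=1e-2):
    """Kernel ridge regression with K = w[0]*K1 + w[1]*K2 (the MKL form);
    returns held-out mean squared error for the given kernel weights."""
    Ktr = w[0] * rbf(x1[tr], x1[tr]) + w[1] * rbf(x2[tr], x2[tr])
    Kte = w[0] * rbf(x1[te], x1[tr]) + w[1] * rbf(x2[te], x2[tr])
    alpha = np.linalg.solve(Ktr + lam * np.eye(150), y[tr])
    return np.mean((Kte @ alpha - y[te]) ** 2)

mse_relevant = krr_mse([1.0, 0.0])      # kernel on x1 only: low error
mse_irrelevant = krr_mse([0.0, 1.0])    # kernel on x2 only: ~variance of y
```

Putting all the weight on the relevant input's kernel yields a far lower held-out error, which is exactly the signal MKL exploits to discard inputs.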
Abstract:
This dissertation explores the support for Internet Protocol version six (IPv6) in the Linux kernel, together with a detailed analysis of the implementation status of the different aspects on which the protocol is based. The study covers experimentation with the general operation of the stack, the identification of its inconsistencies with respect to the relevant RFCs, and the laboratory simulation of scenarios that reproduce use cases for each of the facilities analysed. The aim of this dissertation is not to explain how the new IPv6 protocol works, but rather to focus essentially on exploring IPv6 in the Linux kernel. It is not a document for IPv6 novices; nevertheless, an initial part covers the essentials of the protocol: its evolution up to approval, and its specification. Based on this study, IPv6 support in the Linux kernel is explored through a detailed analysis of the implementation of the different aspects on which the protocol is based, along with IPv6 conformance tests against the RFCs.
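As a minimal user-space illustration of probing the kernel's IPv6 stack (not taken from the dissertation, which works at the level of the kernel implementation itself; the helper function is ours):

```python
import socket

def ipv6_supported():
    """Return True if the running kernel lets us create and bind an IPv6
    socket on the loopback address. A coarse user-space smoke test only;
    conformance testing against the RFCs requires far more than this."""
    if not socket.has_ipv6:
        return False
    try:
        with socket.socket(socket.AF_INET6, socket.SOCK_STREAM) as s:
            s.bind(("::1", 0))   # IPv6 loopback, ephemeral port
        return True
    except OSError:
        return False

print("IPv6 stack usable:", ipv6_supported())
```

A failed bind on `::1` typically means the kernel was built without IPv6 or the module is disabled, which is the kind of baseline check that precedes deeper stack exploration.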
Abstract:
We present a silicon chip-based approach for the enhanced-sensitivity detection of surface-immobilized fluorescent molecules. Green fluorescent protein (GFP) is bound to the silicon substrate by a disuccinimidyl terephthalate-aminosilane immobilization procedure. The immobilized organic layers are characterized by surface analysis techniques such as ellipsometry, atomic force microscopy (AFM), and X-ray-induced photoelectron spectroscopy. We obtain a 20-fold enhancement of the fluorescent signal using constructive interference effects in a fused silica dielectric layer deposited onto the silicon before immobilization. Our method opens perspectives to increase by an order of magnitude the fluorescent response of surface-immobilized DNA- or protein-based layers for a variety of biosensor applications.
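A rough feel for the constructive-interference condition: spacer thicknesses of odd quarter-wave multiples place the fluorophore near a field antinode above the reflective silicon. This simplified estimate ignores the exact reflection phase at the Si interface and the excitation wavelength, and the GFP emission peak and fused-silica index used below are our assumptions, not the authors' design values:

```python
def antinode_thicknesses(wavelength_nm, n_layer, orders=3):
    """Odd quarter-wave multiples inside the dielectric spacer:
    d = (2m + 1) * lambda / (4 * n). Simplified illustration only."""
    return [(2 * m + 1) * wavelength_nm / (4 * n_layer) for m in range(orders)]

# GFP emission peak ~509 nm; fused silica refractive index ~1.46 in the visible.
for d in antinode_thicknesses(509, 1.46):
    print(f"candidate spacer thickness: {d:.0f} nm")
```

In practice the optimal thickness is found by modelling or sweeping the full interference stack for both excitation and emission wavelengths.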
Abstract:
Recently, kernel-based machine learning methods have gained great popularity in many data analysis and data mining fields: pattern recognition, biocomputing, speech and vision, engineering, remote sensing, etc. This paper describes the use of kernel methods to approach the processing of large datasets from environmental monitoring networks. Several typical problems of the environmental sciences and their solutions provided by kernel-based methods are considered: classification of categorical data (soil type classification), mapping of continuous environmental and pollution information (pollution of soil by radionuclides), and mapping with auxiliary information (climatic data from the Aral Sea region). Promising developments, such as automatic emergency hot-spot detection and monitoring network optimization, are discussed as well.
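The continuous-mapping task can be sketched with kernel ridge regression on scattered station measurements; the field and station layout below are synthetic stand-ins for real pollution data, and the paper's own methods (e.g. support vector algorithms) are not reproduced here:

```python
import numpy as np

def rbf(A, B, gamma=8.0):
    """Gaussian kernel between two sets of 2-D station coordinates."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

rng = np.random.default_rng(2)
stations = rng.uniform(0, 1, (80, 2))             # monitoring-station locations
field = lambda xy: np.sin(3 * xy[:, 0]) * np.cos(3 * xy[:, 1])  # "true" field
z = field(stations) + 0.05 * rng.normal(size=80)  # noisy measurements

lam = 1e-2                                        # ridge regularization
alpha = np.linalg.solve(rbf(stations, stations) + lam * np.eye(80), z)

grid = np.array([[x, ypt] for x in np.linspace(0, 1, 25)
                          for ypt in np.linspace(0, 1, 25)])
z_map = rbf(grid, stations) @ alpha               # predicted map on a grid
```

The regularization term controls the smoothness of the interpolated map, playing a role loosely analogous to the nugget in geostatistical kriging.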
Abstract:
CD8(+) cytotoxic T lymphocytes (CTL) can recognize and kill target cells expressing only a few cognate major histocompatibility complex (MHC) I-peptide complexes. This high sensitivity requires efficient scanning of a vast number of highly diverse MHC I-peptide complexes by the T cell receptor in the contact site of transient conjugates formed mainly by nonspecific interactions of ICAM-1 and LFA-1. Tracking of single H-2K(d) molecules loaded with fluorescent peptides on target cells and nascent conjugates with CTL showed dynamic transitions between states of free diffusion and immobility. The immobilizations were explained by association of MHC I-peptide complexes with ICAM-1 and strongly increased their local concentration in cell adhesion sites and hence their scanning by T cell receptor. In nascent immunological synapses cognate complexes became immobile, whereas noncognate ones diffused out again. Interfering with this mobility modulation-based concentration and sorting of MHC I-peptide complexes strongly impaired the sensitivity of antigen recognition by CTL, demonstrating that it constitutes a new basic aspect of antigen presentation by MHC I molecules.
Abstract:
The sequence profile method (Gribskov M, McLachlan AD, Eisenberg D, 1987, Proc Natl Acad Sci USA 84:4355-4358) is a powerful tool to detect distant relationships between amino acid sequences. A profile is a table of position-specific scores and gap penalties, providing a generalized description of a protein motif, which can be used for sequence alignments and database searches instead of an individual sequence. A sequence profile is derived from a multiple sequence alignment. We have found 2 ways to improve the sensitivity of sequence profiles: (1) Sequence weights: Usage of individual weights for each sequence avoids bias toward closely related sequences. These weights are automatically assigned based on the distance of the sequences using a published procedure (Sibbald PR, Argos P, 1990, J Mol Biol 216:813-818). (2) Amino acid substitution table: In addition to the alignment, the construction of a profile also needs an amino acid substitution table. We have found that in some cases a new table, the BLOSUM45 table (Henikoff S, Henikoff JG, 1992, Proc Natl Acad Sci USA 89:10915-10919), is more sensitive than the original Dayhoff table or the modified Dayhoff table used in the current implementation. Profiles derived by the improved method are more sensitive and selective in a number of cases where previous methods have failed to completely separate true members from false positives.
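The weighted-profile idea can be sketched as follows: each sequence contributes to the column residue frequencies in proportion to its weight, and the column score for a residue is the weighted average substitution score. The tiny two-letter alphabet, hand-set weights, and toy substitution table are illustrative, not the published procedure's values:

```python
from collections import defaultdict

def profile_scores(alignment, weights, subst):
    """Build per-column profile scores from a weighted multiple alignment.
    alignment: equal-length sequences; subst[(a, b)]: substitution score.
    Gap penalties, a part of full profiles, are omitted for brevity."""
    ncols = len(alignment[0])
    total_w = sum(weights)
    profile = []
    for col in range(ncols):
        # weighted frequency of each residue observed in this column
        freq = defaultdict(float)
        for seq, w in zip(alignment, weights):
            freq[seq[col]] += w / total_w
        # score for residue b = weighted average substitution score vs. column
        profile.append({b: sum(f * subst[(a, b)] for a, f in freq.items())
                        for b in "AG"})
    return profile

subst = {("A", "A"): 4, ("A", "G"): 0,
         ("G", "A"): 0, ("G", "G"): 6}   # toy substitution table
alignment = ["AAG", "AAG", "AGG"]        # two near-identical seqs, one diverged
weights = [0.5, 0.5, 1.0]                # downweight the closely related pair
prof = profile_scores(alignment, weights, subst)
```

Without the weights, the duplicated pair would dominate column 2 and bias the profile toward its residue; the weighting restores balance, which is the first improvement the paper describes, while the second (the choice of BLOSUM45 versus Dayhoff) corresponds to swapping in a different `subst` table.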