950 results for cut vertex false positive


Relevance: 100.00%

Abstract:

Background & objectives: There is a need to develop an affordable and reliable tool for hearing screening of neonates in resource-constrained, medically underserved areas of developing nations. This study evaluates a strategy of health worker based screening of neonates using a low-cost mechanical calibrated noisemaker, followed up with parental monitoring of age-appropriate auditory milestones, for detecting severe-profound hearing impairment in infants by 6 months of age. Methods: A trained health worker under the supervision of a qualified audiologist screened 425 neonates, of whom 20 had confirmed severe-profound hearing impairment. Mechanical calibrated noisemakers of 50, 60, 70 and 80 dB (A) were used to elicit behavioural responses. The parents of screened neonates were instructed to monitor normal language and auditory milestones until 6 months of age. This strategy was validated against a reference standard consisting of a battery of tests, namely auditory brain stem response (ABR), otoacoustic emissions (OAE) and behavioural assessment at 2 years of age. Bayesian prevalence-weighted measures of screening were calculated. Results: Sensitivity and specificity were high, with the fewest false positive referrals for the 70 and 80 dB (A) noisemakers. All the noisemakers had 100 per cent negative predictive value. The 70 and 80 dB (A) noisemakers had high positive likelihood ratios of 19 and 34, respectively. The differences between pre- and post-test positive probabilities were 43 and 58 percentage points for the 70 and 80 dB (A) noisemakers, respectively. Interpretation & conclusions: In a controlled setting, health workers with primary education can be trained to use a mechanical calibrated noisemaker made of locally available material to reliably screen for severe-profound hearing loss in neonates. The monitoring of auditory responses could be done by informed parents. Multi-centre field trials of this strategy are needed to examine the feasibility of community health care workers using it in resource-constrained settings of developing nations to implement an effective national neonatal hearing screening programme.
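The prevalence-weighted screening measures reported above (sensitivity, specificity, negative predictive value, positive likelihood ratio, pre-/post-test probability) all derive from a 2x2 screening table. A minimal sketch with illustrative counts, not the study's actual data:

```python
def screening_measures(tp, fp, fn, tn):
    """Bayesian prevalence-weighted measures of a screening test,
    computed from a 2x2 table of illustrative counts."""
    total = tp + fp + fn + tn
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    npv = tn / (tn + fn)                       # negative predictive value
    lr_positive = sensitivity / (1 - specificity)
    prevalence = (tp + fn) / total
    pre_odds = prevalence / (1 - prevalence)
    post_odds = pre_odds * lr_positive         # odds form of Bayes' theorem
    post_prob = post_odds / (1 + post_odds)
    return {"sensitivity": sensitivity, "specificity": specificity,
            "npv": npv, "lr_positive": lr_positive,
            "post_test_probability": post_prob}

# hypothetical screen of 425 neonates, 20 of them truly impaired
m = screening_measures(tp=19, fp=20, fn=1, tn=385)
```

With these invented counts the positive likelihood ratio comes out near 19, in the range the abstract reports for the louder noisemakers.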

Relevance: 100.00%

Abstract:

Differential occupancy of space can lead to species coexistence. The fig-fig wasp pollination system hosts species-specific pollinating and parasitic wasps that develop within galls in a nursery comprising a closed inflorescence, the syconium. This microcosm affords excellent opportunities for investigating spatial partitioning since it harbours a closed community in which all wasp species depend on securing safe sites inside the syconium for their developing offspring while differing in life history, egg deposition strategies and oviposition times relative to nursery development. We determined ontogenetic changes in the oviposition sites available to the seven-member fig wasp community of Ficus racemosa, comprising pollinators, gallers and parasitoids. We used species distribution models (SDMs) for the first time at a microcosm scale to predict patterns of spatial occurrence of nursery occupants. SDMs gave high true-positive and low false-positive site occupancy rates for most occupants, indicating species specificity in oviposition sites. The nursery microcosm itself changed with syconium development and sequential egg-laying by different wasp species. The number of sites occupied by offspring of the different wasp species was negatively related to the risk of syconium abortion by the plant host following oviposition. Since unpollinated syconia are usually aborted, parasitic wasps ovipositing at the same time as the pollinator targeted many sites, suggesting a response to the lower risk of syconium abortion once pollination failure was unlikely, compared with species ovipositing before pollination. Wasp life history and oviposition time relative to nursery development contributed to the coexistence of nursery occupants.

Relevance: 100.00%

Abstract:

Glycated hemoglobin (HbA(1c)) is a 'gold standard' biomarker for assessing the glycemic index of an individual. HbA(1c) is formed by nonenzymatic glycosylation at the N-terminal valine residue of the β-globin chain. Cation exchange based high performance liquid chromatography (CE-HPLC) is most commonly used to quantify HbA(1c) in blood samples. A few genetic variants of hemoglobin and post-translationally modified variants of hemoglobin interfere with CE-HPLC-based quantification, resulting in falsely elevated estimates. Using mass spectrometry, we analyzed a blood sample with an abnormally high HbA(1c) (52.1%) in the CE-HPLC method. The observed HbA(1c) was not consistent with the blood glucose level of the patient. A mass spectrometry based bottom-up proteomics approach, intact globin chain mass analysis, and chemical modification of the proteolytic peptides identified the presence of Hb Beckman, a genetic variant of hemoglobin, in the sample. A similar surface area to charge ratio between HbA(1c) and Hb Beckman might have caused the variant to coelute with HbA(1c) in CE-HPLC. Therefore, in screening for diabetes mellitus through the estimation of HbA(1c), it is important to look for genetic variants of hemoglobin in samples that show an abnormally high glycemic index, and HbA(1c) must then be estimated using an alternative method.

Relevance: 100.00%

Abstract:

Uncovering the demographics of extrasolar planets is crucial to understanding the processes of their formation and evolution. In this thesis, we present four studies that contribute to this end, three of which relate to NASA's Kepler mission, which has revolutionized the field of exoplanets in the last few years.

In the pre-Kepler study, we investigate a sample of exoplanet spin-orbit measurements---measurements of the inclination of a planet's orbit relative to the spin axis of its host star---to determine whether a dominant planet migration channel can be identified, and at what confidence. Applying methods of Bayesian model comparison to distinguish between the predictions of several different migration models, we find that the data strongly favor a two-mode migration scenario combining planet-planet scattering and disk migration over a single-mode Kozai migration scenario. While we test only the predictions of particular Kozai and scattering migration models in this work, these methods may be used to test the predictions of any other spin-orbit misaligning mechanism.
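Bayesian model comparison of the sort described above reduces to computing a Bayes factor, the ratio of the likelihoods of the data under the competing models. A toy sketch (the distributions and angles below are invented for illustration and are not the migration models tested in the thesis):

```python
import math

def log_like_aligned(angles_deg, sigma=20.0):
    """Toy 'aligned' model: projected obliquities follow a half-normal
    distribution with assumed scale sigma (tail beyond 180 deg ignored)."""
    const = math.log(math.sqrt(2.0 / math.pi) / sigma)
    return sum(const - a * a / (2.0 * sigma * sigma) for a in angles_deg)

def log_like_uniform(angles_deg):
    """Toy 'misaligned' model: obliquity uniform on [0, 180] degrees."""
    return len(angles_deg) * math.log(1.0 / 180.0)

obliquities = [5.0, 12.0, 8.0, 30.0, 21.0]  # hypothetical measurements
log_bf = log_like_aligned(obliquities) - log_like_uniform(obliquities)
bayes_factor = math.exp(log_bf)  # > 1 favors the aligned model
```

For this mostly well-aligned sample the Bayes factor strongly favors the aligned model; real analyses marginalize over model parameters rather than fixing them as done here.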

We then present two studies addressing astrophysical false positives in Kepler data. The Kepler mission has identified thousands of transiting planet candidates, and only relatively few have yet been dynamically confirmed as bona fide planets, with only a handful more even conceivably amenable to future dynamical confirmation. As a result, the ability to draw detailed conclusions about the diversity of exoplanet systems from Kepler detections relies critically on understanding the probability that any individual candidate might be a false positive. We show that a typical a priori false positive probability for a well-vetted Kepler candidate is only about 5-10%, enabling confidence in demographic studies that treat candidates as true planets. We also present a detailed procedure that can be used to securely and efficiently validate any individual transit candidate using detailed information of the signal's shape as well as follow-up observations, if available.
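The false positive probability of a candidate is, in essence, a posterior computed from scenario priors and how well each scenario reproduces the observed signal. A schematic sketch (the numbers are invented; the actual procedure models the transit shape and follow-up constraints in detail):

```python
def false_positive_probability(prior_planet, like_planet, prior_fp, like_fp):
    """Posterior probability that a transit signal is a false positive,
    via Bayes' rule over two scenarios (true planet vs. false positive).
    All inputs here are illustrative, not calibrated values."""
    fp_weight = prior_fp * like_fp
    planet_weight = prior_planet * like_planet
    return fp_weight / (fp_weight + planet_weight)

# invented numbers: false positive scenarios may be a priori common,
# but they rarely reproduce the observed signal shape well
fpp = false_positive_probability(prior_planet=0.4, like_planet=0.9,
                                 prior_fp=0.6, like_fp=0.03)
```

With these inputs the posterior false positive probability lands below 10%, the same qualitative regime the abstract describes for well-vetted candidates.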

Finally, we calculate an empirical, non-parametric estimate of the shape of the radius distribution of small planets with periods less than 90 days orbiting cool (less than 4000 K) dwarf stars in the Kepler catalog. This effort reveals several notable features of the distribution, in particular a maximum in the radius function around 1-1.25 Earth radii and a steep drop-off in the distribution for radii larger than 2 Earth radii. Even more importantly, the methods presented in this work can be applied to a broader subsample of Kepler targets to understand how the radius function of planets changes across different types of host stars.

Relevance: 100.00%

Abstract:

While synoptic surveys in the optical and at high energies have revealed a rich discovery phase space of slow transients, a similar yield is still awaited in the radio. The majority of past blind surveys, carried out with radio interferometers, have suffered from a low yield of slow transients, ambiguous transient classifications, and contamination by false positives. The newly refurbished Karl G. Jansky Very Large Array (Jansky VLA) offers wider bandwidths for accurate RFI excision as well as substantially improved sensitivity and survey speed compared with the old VLA. The Jansky VLA thus eliminates the pitfalls of interferometric transient searches by facilitating sensitive, wide-field, and near-real-time radio surveys and enabling a systematic exploration of the dynamic radio sky. This thesis carries out blind Jansky VLA surveys to characterize radio variable and transient sources at frequencies of a few GHz and on timescales between days and years. Through joint radio and optical surveys, the thesis addresses outstanding questions pertaining to the rates of slow radio transients (e.g. radio supernovae, tidal disruption events, binary neutron star mergers, stellar flares, etc.), the false-positive foreground relevant to radio and optical counterpart searches for gravitational wave sources, and the beaming factor of gamma-ray bursts. The need for rapid processing of Jansky VLA data and near-real-time transient searching motivated the development of state-of-the-art software infrastructure. The thesis successfully demonstrates the Jansky VLA as a powerful transient search instrument, and it serves as a pathfinder for the transient surveys planned for the SKA-mid pathfinder facilities, viz. ASKAP, MeerKAT, and WSRT/Apertif.

Relevance: 100.00%

Abstract:

The first chapter of this thesis deals with automating data gathering for single cell microfluidic tests. The programs developed saved significant amounts of time with no loss in accuracy. The technology from this chapter was applied to experiments in both Chapters 4 and 5.

The second chapter describes the use of statistical learning to predict whether an anti-angiogenic drug (Bevacizumab) would successfully treat a glioblastoma multiforme tumor. Protein levels were first measured from 92 blood samples using the DNA-encoded antibody library platform, which allowed 35 different proteins to be measured per sample with sensitivity comparable to ELISA. Two statistical learning models were developed to predict whether the treatment would succeed. The first, logistic regression, predicted with 85% accuracy and an AUC of 0.901 using a five-protein panel. These five proteins were statistically significant predictors and gave insight into the mechanism behind anti-angiogenic success or failure. The second model, an ensemble of logistic regression, kNN, and random forest, predicted with a slightly higher accuracy of 87%.
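The accuracy and AUC figures quoted above can be computed directly from a classifier's scores. A minimal pure-Python sketch with toy labels and scores (not the study's data):

```python
def auc(labels, scores):
    """Rank-based AUC: the probability that a randomly chosen positive
    example scores higher than a randomly chosen negative one."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def accuracy(labels, scores, threshold=0.5):
    """Fraction of examples correctly classified at a score threshold."""
    preds = [1 if s >= threshold else 0 for s in scores]
    return sum(p == l for p, l in zip(preds, labels)) / len(labels)

labels = [1, 1, 0, 0]          # toy ground truth
scores = [0.9, 0.4, 0.6, 0.1]  # toy classifier scores
```

AUC is threshold-free while accuracy depends on the chosen cutoff, which is why studies like this one typically report both.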

The third chapter details the development of a photocleavable conjugate that multiplexed cell surface detection in microfluidic devices. The method successfully detected streptavidin on coated beads with 92% positive predictive rate. Furthermore, chambers with 0, 1, 2, and 3+ beads were statistically distinguishable. The method was then used to detect CD3 on Jurkat T cells, yielding a positive predictive rate of 49% and false positive rate of 0%.

The fourth chapter examines whether measuring T cell polyfunctionality can predict whether a patient will respond to adoptive T cell transfer therapy. In 15 patients, we measured 10 proteins from individual T cells (~300 cells per patient). The polyfunctional strength index was calculated and then correlated with each patient's progression-free survival (PFS) time. Fifty-two other parameters measured in the single-cell test were also correlated with PFS. No statistically significant correlate has been identified, however, and more data are necessary to reach a conclusion.

Finally, the fifth chapter examines interactions between T cells and how they affect protein secretion. T cells in direct contact were observed to selectively enhance their protein secretion, in some cases by over 5-fold. This occurred for Granzyme B, Perforin, CCL4, TNF-α, and IFN-γ; IL-10 was shown to decrease slightly upon contact. This phenomenon held true for T cells from all patients tested (n=8). Using single-cell data, the theoretical protein secretion frequency was calculated for two cells and then compared with the observed rates for pairs of cells not in contact and pairs in contact. In over 90% of cases, the theoretical protein secretion rate matched that of two cells not in contact.
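A plausible reading of the "theoretical protein secretion frequency" for two cells is an independence null model: if a fraction p of single cells secrete a protein, two non-interacting cells should show secretion whenever at least one of them secretes. A minimal sketch under that assumption:

```python
def expected_pair_frequency(p_single):
    """Expected fraction of two-cell chambers showing secretion, assuming
    the two cells behave independently (a no-contact null model; this is
    an assumed formulation, not necessarily the thesis's exact one)."""
    return 1.0 - (1.0 - p_single) ** 2

# e.g. if 30% of single cells secrete a protein, independence predicts
# that about 51% of two-cell chambers show secretion
f = expected_pair_frequency(0.30)
```

Deviations of contacting pairs above this null are then evidence of contact-enhanced secretion, matching the chapter's observation.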

Relevance: 100.00%

Abstract:

In this work, classification models applied to climate data mining were developed to forecast extreme precipitation events one hour in advance. Specifically, we used observational data recorded by the surface meteorological station located at the Instituto Politécnico of the Universidade do Estado do Rio de Janeiro in Nova Friburgo, RJ, over the period 2008 to 2012. The Knowledge Discovery in Databases (KDD) process was applied to these data, comprising the stages of data preparation, mining, and post-processing. Using Artificial Neural Network and Decision Tree algorithms to extract patterns indicating accumulated precipitation greater than 10 mm in the hour following the measurement of the climate variables, we observed that micro-scale meteorological observation for short-term forecasting is prone to high false alarm (false positive) rates. To address this problem, we used historical forecasts produced by the Eta Model at 15 km resolution, made available by the Centro de Previsão de Tempo e Estudos Climáticos of the Instituto Nacional de Pesquisas Espaciais (CPTEC/INPE). With these data, it was possible to compute the instability indices related to the formation of severe convective conditions in the Nova Friburgo region and to store them in a structured database, joining the micro- and meso-scale records. The results showed that joining the two databases was crucial for reducing false positive rates, an important contribution to meteorological studies conducted at surface weather stations. Finally, the most accurate model was used to develop a real-time alert system that checks, for the studied region, the possibility of rainfall greater than 10 mm in the next hour.

Relevance: 100.00%

Abstract:

Thin film bulk acoustic wave resonator (FBAR) devices supporting multiple simultaneous resonance modes have been designed for gravimetric sensing. The mechanism for dual-mode generation within a single device is discussed, and theoretical calculations based on finite element analysis allowed the fabrication of FBARs whose resonance modes react oppositely to temperature changes: one mode exhibits a positive frequency shift for a rise in temperature whilst the other exhibits a negative shift. Both modes exhibit a negative frequency shift under mass load, and hence by monitoring both modes simultaneously it is possible to distinguish whether a change in resonance frequency is due to a mass load or a temperature variation (or a combination of both), avoiding false positive/negative responses in gravimetric sensing without the need for additional reference devices or complex electronics.
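The disambiguation described above amounts to solving a 2x2 linear system: to first order, each mode's frequency shift is a linear combination of the mass load and the temperature change, and because the two temperature coefficients have opposite signs the system is well conditioned. A sketch with invented sensitivity coefficients:

```python
def decompose_shifts(df1, df2, sm1, st1, sm2, st2):
    """Recover mass load dm and temperature change dT from the frequency
    shifts of the two modes, assuming linear sensitivities:
        df1 = sm1*dm + st1*dT,   df2 = sm2*dm + st2*dT
    Solved by Cramer's rule; all coefficients here are invented."""
    det = sm1 * st2 - sm2 * st1
    dm = (df1 * st2 - df2 * st1) / det
    dT = (sm1 * df2 - sm2 * df1) / det
    return dm, dT

# both modes shift down under mass load (sm < 0), while the temperature
# coefficients have opposite signs (st1 > 0, st2 < 0), as in the paper
dm, dT = decompose_shifts(df1=-1.5, df2=-3.2,
                          sm1=-1.0, st1=0.5, sm2=-1.2, st2=-0.8)
```

With the shifts above (generated from dm = 2 and dT = 1) the solver recovers both contributions, which is why no reference device is needed.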

Relevance: 100.00%

Abstract:

Since protein phosphorylation is a dominant mechanism of information transfer in cells, there is a great need for methods capable of accurately elucidating sites of phosphorylation. In recent years mass spectrometry has become an increasingly viable alternative to more traditional methods of phosphorylation analysis. The present study used immobilized metal affinity chromatography (IMAC) coupled with a linear ion trap mass spectrometer to analyze phosphorylated proteins in mouse liver. A total of 26 peptide sequences defining 26 sites of phosphorylation were determined. Although this number of identified phosphoproteins is not large, the approach is still of interest because a series of conservative criteria were adopted in data analysis. We note that, although some binding of non-phosphorylated peptides to the IMAC column was apparent, the improvements in high-speed scanning and in the quality of MS/MS spectra provided by the linear ion trap contributed to phosphoprotein identification. Further analysis demonstrated that MS/MS/MS analysis was necessary to exclude false-positive matches resulting from the MS/MS experiments, especially for multiphosphorylated peptides. The linear ion trap considerably facilitated the exploitation of nanoflow-HPLC/MS/MS, and MS/MS/MS has great potential in phosphoproteome research on relatively complex samples.

Relevance: 100.00%

Abstract:

Mammographic mass detection is an important task for the early diagnosis of breast cancer. However, it is difficult to distinguish masses from normal regions because of their diverse morphological characteristics and ambiguous margins. To improve mass detection performance, it is essential to preprocess the mammogram effectively so as to preserve both the intensity distribution and the morphological characteristics of regions. In this paper, morphological component analysis is first introduced to decompose a mammogram into a piecewise-smooth component and a texture component. The former is used in our detection scheme because it effectively suppresses both structural noise and the effects of blood vessels. We then propose two novel concentric layer criteria to detect different types of suspicious regions in a mammogram. The combination is evaluated on the Digital Database for Screening Mammography, using 100 malignant and 50 benign cases. The sensitivity of the proposed scheme is 99% for malignant cases, 88% for benign cases, and 95.3% overall. The results show that the proposed detection scheme achieves satisfactory detection performance and a favorable compromise between sensitivity and false positive rate.

Relevance: 100.00%

Abstract:

Three-dimensional models which contain both geometry and texture have numerous applications such as urban planning, physical simulation, and virtual environments. A major focus of computer vision (and recently graphics) research is the automatic recovery of three-dimensional models from two-dimensional images. After many years of research this goal is yet to be achieved. Most practical modeling systems require substantial human input and, unlike automatic systems, are not scalable. This thesis presents a novel method for automatically recovering dense surface patches using large sets (thousands) of calibrated images taken from arbitrary positions within the scene. Physical instruments, such as the Global Positioning System (GPS), inertial sensors, and inclinometers, are used to estimate the position and orientation of each image. Essentially, the problem is to find corresponding points in each of the images. Once a correspondence has been established, calculating its three-dimensional position is simply a matter of geometry. Long baseline images improve the accuracy; short baseline images and the large number of images greatly simplify the correspondence problem. The initial stage of the algorithm is completely local and scales linearly with the number of images. Subsequent stages are global in nature, exploit geometric constraints, and scale quadratically with the complexity of the underlying scene. We describe techniques for: 1) detecting and localizing surface patches; 2) refining camera calibration estimates and rejecting false positive surfels; and 3) grouping surface patches into surfaces and growing the surface along a two-dimensional manifold. We also discuss a method for producing high quality, textured three-dimensional models from these surfaces.
Some of the most important characteristics of this approach are that it: 1) uses and refines noisy calibration estimates; 2) compensates for large variations in illumination; 3) tolerates significant soft occlusion (e.g. tree branches); and 4) associates, at a fundamental level, an estimated normal (i.e. no frontal-planar assumption) and texture with each surface patch.
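Once a correspondence is established, the 3D point follows from ray geometry. One common minimal formulation (an illustrative sketch, not necessarily the thesis's exact method) takes the midpoint of the shortest segment between the two viewing rays:

```python
def dot(u, v):  return sum(a * b for a, b in zip(u, v))
def add(u, v):  return tuple(a + b for a, b in zip(u, v))
def sub(u, v):  return tuple(a - b for a, b in zip(u, v))
def scale(u, k): return tuple(a * k for a in u)

def triangulate(p1, d1, p2, d2):
    """Midpoint of the shortest segment between rays p1 + s*d1 and
    p2 + t*d2 (camera centers p_i, viewing directions d_i)."""
    w0 = sub(p1, p2)
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w0), dot(d2, w0)
    denom = a * c - b * b      # zero when the rays are parallel
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    q1 = add(p1, scale(d1, s))  # closest point on ray 1
    q2 = add(p2, scale(d2, t))  # closest point on ray 2
    return scale(add(q1, q2), 0.5)

# two calibrated views whose rays intersect at (1, 0, 0)
point = triangulate((0.0, 0.0, 0.0), (1.0, 0.0, 0.0),
                    (1.0, -1.0, 0.0), (0.0, 1.0, 0.0))
```

With noisy calibration the rays rarely intersect exactly, which is why the midpoint (and the residual gap between q1 and q2) is a natural quality measure for rejecting false positive surfels.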

Relevance: 100.00%

Abstract:

Alignment is a prevalent approach for recognizing 3D objects in 2D images. A major problem with current implementations is how to robustly handle errors that propagate from uncertainties in the locations of image features. This thesis gives a technique for bounding these errors. The technique makes use of a new solution to the problem of recovering 3D pose from three matching point pairs under weak-perspective projection. Furthermore, the error bounds are used to demonstrate that using line segments for features instead of points significantly reduces the false positive rate, to the extent that alignment can remain reliable even in cluttered scenes.

Relevance: 100.00%

Abstract:

Karwath, A. and King, R. D. Homology induction: the use of machine learning to improve sequence similarity searches. BMC Bioinformatics, 3:11, 23 April 2002. An additional file describes the title organism species declaration in one string [http://www.biomedcentral.com/content/supplementary/1471-2105-3-11-S1.doc]. Sponsorship: Andreas Karwath and Ross D. King were supported by EPSRC grant GR/L62849.

Relevance: 100.00%

Abstract:

Object detection can be challenging when the object class exhibits large variations. One commonly used strategy is to first partition the space of possible object variations and then train separate classifiers for each portion. However, with continuous spaces the partitions tend to be arbitrary, since there are no natural boundaries (consider, for example, the continuous range of human body poses). In this paper, a new formulation is proposed in which the detectors themselves are associated with continuous parameters and reside in a parameterized function space. This strategy has two advantages. First, a priori partitioning of the parameter space is not needed, since the detectors themselves live in a parameterized space. Second, the underlying parameters for object variations can be learned from training data in an unsupervised manner. In profile face detection experiments, at a fixed false alarm number of 90, our method attains a detection rate of 75% vs. 70% for the method of Viola-Jones. In hand shape detection, at a false positive rate of 0.1%, our method achieves a detection rate of 99.5% vs. 98% for partition-based methods. In pedestrian detection, our method reduces the missed detection rate by a factor of three at a false positive rate of 1%, compared with the method of Dalal-Triggs.