936 results for Target Field Method


Relevance: 30.00%

Abstract:

A major component of minimally invasive cochlear implantation is atraumatic scala tympani (ST) placement of the electrode array. This work reports on a semiautomatic planning paradigm that uses anatomical landmarks and cochlear surface models to compute the cochleostomy target and insertion trajectory. The method was validated in a human whole-head cadaver model (n = 10 ears). Cochleostomy targets were generated by an automated script and used for subsequent planning of a direct cochlear access (DCA) drill trajectory from the mastoid surface to the inner ear. An image-guided robotic system was used to perform both DCA and cochleostomy drilling. Nine of 10 implanted specimens showed complete ST placement. One case of scala vestibuli insertion occurred due to a registration/drilling error of 0.79 mm. The presented approach indicates that a safe cochleostomy target and insertion trajectory can be planned using conventional clinical imaging modalities, even though these lack sufficient resolution to identify the basilar membrane.
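The planning script itself is not described in detail above; purely as a geometric sketch (the landmark coordinates, the facial-nerve sample point, and the clearance threshold below are all hypothetical), a straight DCA drill axis from a mastoid entry point to a cochleostomy target, with a safety check against a critical structure, could be computed like this:

    import numpy as np

    def plan_dca_trajectory(entry_point, cochleostomy_target, critical_points, min_clearance_mm=0.5):
        """Sketch: straight drill trajectory from a mastoid-surface entry point to a
        cochleostomy target, with a clearance check against critical structures.
        All coordinates are assumed to be in millimetres in image space."""
        entry = np.asarray(entry_point, dtype=float)
        target = np.asarray(cochleostomy_target, dtype=float)
        axis = target - entry
        length = np.linalg.norm(axis)
        direction = axis / length

        # Distance of each critical point to the finite drill segment.
        clearances = []
        for p in np.atleast_2d(np.asarray(critical_points, dtype=float)):
            t = np.clip(np.dot(p - entry, direction), 0.0, length)
            clearances.append(np.linalg.norm(p - (entry + t * direction)))

        safe = all(c >= min_clearance_mm for c in clearances)
        return direction, length, min(clearances), safe

    # Hypothetical coordinates (mm): entry on the mastoid surface, target on the ST,
    # one sample point standing in for the facial nerve.
    direction, length, clearance, safe = plan_dca_trajectory(
        entry_point=[62.0, 41.5, 30.0],
        cochleostomy_target=[48.3, 55.1, 22.7],
        critical_points=[[54.0, 49.0, 26.0]],
    )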

Relevance: 30.00%

Abstract:

We propose a method to acquire 3D light fields with a hand-held camera and describe several computational photography applications facilitated by our approach. As input we take an image sequence from a camera translating along an approximately linear path with limited camera rotations. Users can acquire such data easily in a few seconds by moving a hand-held camera. We introduce a novel approach that resamples the input into a regularly sampled 3D light field by aligning the images in the spatio-temporal domain, and a technique for high-quality disparity estimation from light fields. We show applications including digital refocusing and synthetic aperture blur, foreground removal, selective colorization, and others.
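As an illustration of the digital refocusing mentioned above, a minimal shift-and-add sketch over a regularly sampled 3D light field (the array layout lf[s, y, x], indexed by camera position s along the translation path, and the disparity value are assumptions for this example):

    import numpy as np

    def refocus(lf, disparity):
        """Shift-and-add refocusing of a grayscale 3D light field lf[s, y, x].
        Each view s is shifted horizontally by (s - center) * disparity pixels
        and averaged, bringing scene points with that disparity into focus."""
        num_views = lf.shape[0]
        center = (num_views - 1) / 2.0
        out = np.zeros_like(lf[0], dtype=float)
        for s in range(num_views):
            shift = int(round((s - center) * disparity))
            out += np.roll(lf[s], shift, axis=1)  # shift along x
        return out / num_views

    # Stand-in data; a real light field would come from the resampled image sequence.
    lf = np.random.rand(9, 120, 160)
    refocused = refocus(lf, disparity=1.5)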

Relevance: 30.00%

Abstract:

Perceived duration is assumed to be positively related to nontemporal stimulus magnitude. Recently, the finding that larger stimuli are perceived to last longer has been challenged as representing a mere decisional bias induced by the use of comparative duration judgments. Therefore, in the present study, the method of temporal reproduction was applied as a psychophysical procedure to quantify perceived duration. Another major goal was to investigate the influence of attention on the effect of visual stimulus size on perceived duration. For this purpose, an additional dual-task paradigm was employed. Our results not only converged with previous findings in demonstrating a positive functional relationship between nontemporal stimulus size and perceived duration, but also showed that the effect of stimulus size on perceived duration is not confined to comparative duration judgments. Furthermore, the effect of stimulus size proved to be independent of the attentional resources allocated to stimulus size; nontemporal visual stimulus information does not need to be processed intentionally to influence perceived duration. Finally, the effect of nontemporal stimulus size on perceived duration was modulated by the duration of the target intervals, suggesting a hitherto largely unrecognized role of temporal context in the effect of nontemporal stimulus size.

Relevance: 30.00%

Abstract:

We present a novel approach to the reconstruction of depth from light field data. Our method uses dictionary representations and group sparsity constraints to derive a convex formulation. Although our solution increases the dimensionality of the problem, we keep the numerical complexity at bay by restricting the space of solutions and by exploiting an efficient primal-dual formulation. Comparisons with state-of-the-art techniques, on both synthetic and real data, show promising performance.
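The paper's exact objective is not reproduced here; a generic group-sparse convex formulation of the kind the abstract alludes to (dictionary D, light-field measurements y, coefficient vector x partitioned into groups g, regularization weight λ) reads:

    \min_{x} \; \tfrac{1}{2}\, \| y - D x \|_2^2 \; + \; \lambda \sum_{g \in \mathcal{G}} \| x_g \|_2

The group penalty induces sparsity at the level of whole groups of coefficients, and problems of this form are routinely handled with primal-dual splitting schemes.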

Relevance: 30.00%

Abstract:

We present a method to reach electric field intensities as high as 400 kV/cm in liquid argon for cathode-ground distances of several millimeters. This is achieved by suppressing field emission from the cathode, overcoming limitations that we reported earlier.
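For scale, using the parallel-plate relation E = V/d (the 5 mm gap below is illustrative, since only "several millimeters" is stated):

    V = E \cdot d = 400\ \mathrm{kV/cm} \times 0.5\ \mathrm{cm} = 200\ \mathrm{kV}

i.e. sustaining 400 kV/cm across a 5 mm cathode-ground gap means holding roughly 200 kV without field emission or breakdown.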

Relevance: 30.00%

Abstract:

Methods for tracking an object have generally fallen into two groups: tracking by detection and tracking through local optimization. The advantage of detection-based tracking is its ability to deal with target appearance and disappearance, but it does not naturally take advantage of target motion continuity during detection. The advantage of local optimization is efficiency and accuracy, but it requires additional algorithms to initialize tracking when the target is lost. To bridge these two approaches, we propose a framework for unified detection and tracking as a time-series Bayesian estimation problem. The basis of our approach is to treat both detection and tracking as a sequential entropy minimization problem, where the goal is to determine the parameters describing a target in each frame. To do this we integrate the Active Testing (AT) paradigm with Bayesian filtering, and this results in a framework capable of both detecting and tracking robustly in situations where the target object enters and leaves the field of view regularly. We demonstrate our approach on a retinal tool tracking problem and show through extensive experiments that our method provides an efficient and robust tracking solution.
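The Active Testing integration is specific to this work, but the underlying idea of sequential Bayesian estimation with an entropy criterion can be sketched on a discretized state space (the grid size, motion model, and observation likelihood below are illustrative, not the paper's):

    import numpy as np

    def bayes_step(belief, transition, likelihood):
        """One predict/update step of a discrete Bayes filter.
        belief: current P(state) over a discretized target-parameter grid.
        transition: motion model matrix (exploits target motion continuity).
        likelihood: P(observation | state) for the current frame."""
        predicted = transition @ belief
        posterior = predicted * likelihood
        return posterior / posterior.sum()

    def entropy(p):
        """Shannon entropy of the posterior: high entropy suggests the target is
        lost (fall back to detection), low entropy supports local tracking."""
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    n = 50
    belief = np.full(n, 1.0 / n)                       # uninformed prior
    transition = (0.8 * np.eye(n)                      # mostly stay in place
                  + 0.1 * np.roll(np.eye(n), 1, axis=1)
                  + 0.1 * np.roll(np.eye(n), -1, axis=1))
    likelihood = np.exp(-0.5 * ((np.arange(n) - 30) / 2.0) ** 2)  # fake observation
    belief = bayes_step(belief, transition, likelihood)
    print(entropy(belief))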

Relevance: 30.00%

Abstract:

This thesis covers a broad part of the field of computational photography, including video stabilization and image warping techniques, an introduction to light field photography, and the conversion of monocular images and videos into stereoscopic 3D content. We present a user-assisted technique for stereoscopic 3D conversion from 2D images. Our approach exploits the geometric structure of perspective images, including vanishing points. We allow a user to indicate lines, planes, and vanishing points in the input image, and directly employ these as guides of an image warp that produces a stereo image pair. Our method is most suitable for scenes with large-scale structures such as buildings, and avoids constructing an explicit depth map. Further, we propose a method to acquire 3D light fields using a hand-held camera, and describe several computational photography applications facilitated by our approach. As input we take an image sequence from a camera translating along an approximately linear path with limited camera rotations. Users can acquire such data easily in a few seconds by moving a hand-held camera. We convert the input into a regularly sampled 3D light field by resampling and aligning the images in the spatio-temporal domain. We also present a novel technique for high-quality disparity estimation from light fields. Finally, we show applications including digital refocusing and synthetic aperture blur, foreground removal, selective colorization, and others.

Relevance: 30.00%

Abstract:

The concentration of 11-nor-9-carboxy-Δ9-tetrahydrocannabinol (THCCOOH) in whole blood is used as a parameter for assessing the consumption behavior of cannabis users. The blood level of THCCOOH-glucuronide might provide additional information about the frequency of cannabis use. To verify this assumption, a column-switching liquid chromatography-tandem mass spectrometry (LC-MS/MS) method for the rapid and direct quantification of free and glucuronidated THCCOOH in human whole blood was developed. The method comprised protein precipitation, followed by injection of the processed sample onto a trapping column and subsequent gradient elution to an analytical column for separation and detection. The total LC run time was 4.5 min. Detection of the analytes was accomplished by electrospray ionization in positive ion mode and selected reaction monitoring on a triple-stage quadrupole mass spectrometer. The method was fully validated by evaluating the following parameters: linearity, lower limit of quantification, accuracy and imprecision, selectivity, extraction efficiency, matrix effect, carry-over, dilution integrity, analyte stability, and re-injection reproducibility. All parameters were evaluated and met the predefined acceptance criteria. Linearity ranged from 5.0 to 500 μg/L for both analytes. The method was successfully applied to whole blood samples from a large collective of cannabis consumers, demonstrating its applicability in the forensic field.

Relevance: 30.00%

Abstract:

Contagious caprine pleuropneumonia (CCPP) is a highly contagious disease caused by Mycoplasma capricolum subsp. capripneumoniae that affects goats in Africa and Asia. Currently available methods for the diagnosis of Mycoplasma infection, including cultivation, serological assays, and PCR, are time-consuming and require fully equipped stationary laboratories, which makes them unsuitable for the resource-poor settings that are most relevant to this disease. We report a rapid, specific, and sensitive assay employing isothermal DNA amplification by recombinase polymerase amplification (RPA) for the detection of M. capricolum subsp. capripneumoniae. The assay targets a sequence specific to M. capricolum subsp. capripneumoniae that is present in the genome sequences of the field strain ILRI181 and the type strain F38 and was further confirmed in 10 field strains from different geographical regions. Detection limits corresponding to 5 × 10³ and 5 × 10⁴ cells/ml were obtained using genomic DNA and bacterial culture from M. capricolum subsp. capripneumoniae strain ILRI181, while no amplification was obtained from 71 related Mycoplasma isolates or from Acholeplasma or Pasteurella isolates, demonstrating a high degree of specificity. The assay produces a fluorescent signal within 15 to 20 min and works well with pleural fluid obtained directly from CCPP-positive animals without prior DNA extraction. We demonstrate that the diagnosis of CCPP can be achieved in less than 45 min in a simulated field setting, with a short sample preparation time and a simple read-out device that can be powered by a car battery.

Relevance: 30.00%

Abstract:

In situ and simultaneous measurement of the three most abundant isotopologues of methane using mid-infrared laser absorption spectroscopy is demonstrated. A field-deployable, autonomous platform is realized by coupling a compact quantum cascade laser absorption spectrometer (QCLAS) to a preconcentration unit, the trace gas extractor (TREX). This unit enhances CH4 mole fractions by a factor of up to 500 above ambient levels and quantitatively separates interfering trace gases such as N2O and CO2. The analytical precision of the QCLAS isotope measurement on the preconcentrated methane (750 ppm, µmol mol⁻¹) is 0.1 ‰ for δ13C-CH4 and 0.5 ‰ for δD-CH4 at 10 min averaging time. Based on repeated measurements of compressed air during a 2-week intercomparison campaign, the repeatability of the TREX–QCLAS was determined to be 0.19 ‰ for δ13C-CH4 and 1.9 ‰ for δD-CH4. In this campaign the new in situ technique was compared to isotope-ratio mass spectrometry (IRMS) of glass-flask and bag samples and to real-time CH4 isotope analysis by two commercially available laser spectrometers. Both laser-based analyzers were limited to methane mole fraction and δ13C-CH4 analysis, and only one of them, a cavity ring-down spectrometer, was capable of delivering meaningful isotopic data. After correcting for scale offsets, the average differences between TREX–QCLAS data and bag/flask sampling–IRMS values are within the extended WMO compatibility goals of 0.2 ‰ and 5 ‰ for δ13C-CH4 and δD-CH4, respectively. This also demonstrates the potential to improve interlaboratory compatibility through the analysis of a reference air sample with an accurately determined isotopic composition.
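For reference, the δ values quoted above follow standard isotope-ratio delta notation relative to an international reference material (VPDB for carbon, VSMOW for hydrogen):

    \delta^{13}\mathrm{C}\;[\text{‰}] = \left( \frac{R_{\mathrm{sample}}}{R_{\mathrm{standard}}} - 1 \right) \times 1000, \qquad R = {}^{13}\mathrm{C}/{}^{12}\mathrm{C}

and analogously for δD with the D/H ratio.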

Relevance: 30.00%

Abstract:

Purpose: To investigate whether nonhemodynamic resonant saturation effects can be detected in patients with focal epilepsy by using a phase-cycled stimulus-induced rotary saturation (PC-SIRS) approach with spin-lock (SL) preparation, and whether they colocalize with the seizure onset zone and surface interictal epileptiform discharges (IED). Materials and Methods: The study was approved by the local ethics committee, and all subjects gave written informed consent. Eight patients with focal epilepsy undergoing presurgical surface and intracranial electroencephalography (EEG) underwent magnetic resonance (MR) imaging at 3 T with a whole-brain PC-SIRS imaging sequence with alternating SL-on and SL-off and two-dimensional echo-planar readout. The power of the SL radiofrequency pulse was set to 120 Hz to sensitize the sequence to the high-gamma oscillations present in epileptogenic tissue. Phase cycling was applied to capture distributed current orientations. Voxel-wise subtraction of SL-off from SL-on images enabled the separation of T2* effects from rotary saturation effects. The topography of PC-SIRS effects was compared with the seizure onset zone at intracranial EEG and with surface IED-related potentials. Bayesian statistics were used to test whether prior PC-SIRS information could improve IED source reconstruction. Results: Nonhemodynamic resonant saturation effects ipsilateral to the seizure onset zone were detected in six of eight patients (concordance rate, 0.75; 95% confidence interval: 0.40, 0.94) by means of the PC-SIRS technique. They were concordant with IED surface negativity in seven of eight patients (0.88; 95% confidence interval: 0.51, 1.00). Including PC-SIRS as prior information improved the evidence of the standard EEG source models compared with uninformed reconstructions (exceedance probability, 0.77 vs 0.12; Wilcoxon test of model evidence, P < .05). Nonhemodynamic resonant saturation effects resolved in patients with favorable postsurgical outcomes but persisted in patients with postsurgical seizure recurrence. Conclusion: Nonhemodynamic resonant saturation effects are detectable during interictal periods with the PC-SIRS approach in patients with epilepsy. The method may be useful for MR imaging-based detection of neuronal currents in a clinical environment.

Relevance: 30.00%

Abstract:

Historically, morphological features were used as the primary means to classify organisms. However, the age of molecular genetics has allowed us to approach this field from the perspective of the organism's genetic code. Early work used highly conserved sequences, such as ribosomal RNA. The increasing number of complete genomes in the public data repositories provides the opportunity to look not only at a single gene, but at an organism's entire parts list.

Here the Sequence Comparison Index (SCI) and the Organism Comparison Index (OCI), algorithms and methods to compare proteins and proteomes, are presented. The complete proteomes of 104 sequenced organisms were compared. Over 280 million full Smith-Waterman alignments were performed on sequence pairs that had a reasonable expectation of being related. From these alignments a whole-proteome phylogenetic tree was constructed. This method was also used to compare the small subunit (SSU) rRNA from each organism, and a tree was constructed from these results. The SSU rRNA tree produced by the SCI/OCI method looks very much like accepted SSU rRNA trees from sources such as the Ribosomal Database Project, thus validating the method. The SCI/OCI proteome tree showed a number of small but significant differences when compared to the SSU rRNA tree and to proteome trees constructed by other methods. Horizontal gene transfer does not appear to affect the SCI/OCI trees until the transferred genes make up a large portion of the proteome.

As part of this work, the Database of Related Local Alignments (DaRLA) was created; it contains over 81 million rows of sequence alignment information. DaRLA, while primarily used to build the whole-proteome trees, can also be applied to shared gene content analysis, gene order analysis, and the construction of individual protein trees.

Finally, the standard BLAST method for analyzing shared gene content was compared to the SCI method using four spirochetes. The SCI system performed flawlessly, finding all proteins from one organism against itself and all the ribosomal proteins between organisms. The BLAST system missed some proteins from its respective organism and failed to detect small ribosomal proteins between organisms.
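The SCI/OCI scoring itself is specific to this work, but the Smith-Waterman alignments it builds on follow the standard textbook recurrence; a minimal sketch with linear gap penalties and illustrative scoring parameters (protein work would normally use a substitution matrix and affine gaps):

    def smith_waterman(a, b, match=2, mismatch=-1, gap=-2):
        """Standard Smith-Waterman local alignment score with a linear gap penalty.
        Returns the best local alignment score between sequences a and b."""
        rows, cols = len(a) + 1, len(b) + 1
        H = [[0] * cols for _ in range(rows)]
        best = 0
        for i in range(1, rows):
            for j in range(1, cols):
                diag = H[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
                H[i][j] = max(0, diag, H[i - 1][j] + gap, H[i][j - 1] + gap)
                best = max(best, H[i][j])
        return best

    print(smith_waterman("HEAGAWGHEE", "PAWHEAE"))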

Relevance: 30.00%

Abstract:

Various airborne aldehydes and ketones (i.e., airborne carbonyls) present in outdoor, indoor, and personal air pose a risk to human health at present environmental concentrations. To date, there is no adequate, simple-to-use sampler for monitoring carbonyls at parts-per-billion concentrations in personal air. The Passive Aldehydes and Ketones Sampler (PAKS) originally developed for this purpose has been found to be unreliable in a number of relatively recent field studies. The PAKS method uses dansylhydrazine (DNSH) as the derivatization agent to produce aldehyde derivatives that are analyzed by HPLC with fluorescence detection. The reasons for the poor performance of the PAKS are not known, but it is hypothesized that the chemical derivatization conditions and reaction kinetics, combined with a relatively low sampling rate, may play a role. This study evaluated the effect of excitation and emission wavelengths, pH of the DNSH coating solution, extraction solvent, and time post-extraction on the yield and stability of the formaldehyde, acetaldehyde, and acrolein DNSH derivatives. The results suggest the following optimum conditions for the analysis of DNS-hydrazones. The excitation and emission wavelengths for HPLC analysis should be 250 nm and 500 nm, respectively. The optimal pH of the coating solution appears to be pH 2, because it improves the formation of the di-derivatized acrolein DNS-hydrazone without affecting the response of the formaldehyde and acetaldehyde derivatives. Acetonitrile is the preferred extraction solvent, and the optimal time to analyze the aldehyde derivatives is 72 hours post-extraction.

Relevance: 30.00%

Abstract:

In population studies, most current methods focus on identifying one outcome-related SNP at a time by testing for differences of genotype frequencies between disease and healthy groups or among different population groups. However, testing a large number of SNPs simultaneously raises the problem of multiple testing and yields false-positive results. Although this problem can be addressed through approaches such as Bonferroni correction, permutation testing, and false discovery rates, patterns of joint effects of several genes, each with a weak effect, may remain undetected. With the availability of high-throughput genotyping technology, searching for multiple scattered SNPs over the whole genome and modeling their joint effect on the target variable has become possible. Exhaustive search of all SNP subsets is computationally infeasible for millions of SNPs in a genome-wide study. Several effective feature selection methods combined with classification functions have been proposed to search for an optimal SNP subset in large data sets where the number of feature SNPs far exceeds the number of observations.

In this study, we take two steps to achieve this goal. First, we selected 1000 SNPs through an effective filter method; we then performed feature selection wrapped around a classifier to identify an optimal SNP subset for predicting disease. We also developed a novel classification method, the sequential information bottleneck (sIB), wrapped inside different search algorithms to identify an optimal subset of SNPs for classifying the outcome variable. This new method was compared with classical linear discriminant analysis (LDA) in terms of classification performance. Finally, we performed a chi-square test to examine the relationship between each SNP and disease from another point of view.

In general, our results show that filtering features using the harmonic mean of sensitivity and specificity (HMSS) through LDA is better than using LDA training accuracy or mutual information in our study. Our results also demonstrate that exhaustive search over small subsets (one SNP, two SNPs, or three-SNP subsets built from the best 100 two-SNP combinations) can find an optimal subset, and that further inclusion of more SNPs through a heuristic algorithm does not always improve the performance of the SNP subsets. Although sequential forward floating selection can be applied to avoid the nesting effect of forward selection, it does not always outperform the latter, owing to overfitting from exploring more complex subset states.

Our results also indicate that HMSS, as a criterion for evaluating the classification ability of a function, can be used with imbalanced data without modifying the original dataset, in contrast to classification accuracy. Our four studies suggest that sIB, an unsupervised technique, can be adopted to predict the outcome, and that its ability to detect the target status is superior to that of traditional LDA in this study.

From our results, the best test HMSS for predicting CVD, stroke, CAD, and psoriasis through sIB is 0.59406, 0.641815, 0.645315, and 0.678658, respectively. In terms of group prediction accuracy, the highest test accuracy of sIB for diagnosing a normal status among controls reaches 0.708999, 0.863216, 0.639918, and 0.850275, respectively, in the four studies when the test accuracy among cases is required to be at least 0.4. Conversely, the highest test accuracy of sIB for diagnosing disease among cases reaches 0.748644, 0.789916, 0.705701, and 0.749436, respectively, when the test accuracy among controls is required to be at least 0.4.

A further genome-wide association study using the chi-square test shows that no significant SNPs are detected at the cut-off level of 9.09451E-08 in the Framingham Heart Study of CVD. In the WTCCC data, only two significant SNPs associated with CAD are detected. In the genome-wide study of psoriasis, most of the top 20 SNP markers with impressive classification accuracy are also significantly associated with the disease by the chi-square test at the cut-off value of 1.11E-07.

Although our classification methods achieve high accuracy in this study, complete characterization of these classification results (95% confidence intervals or statistical tests of differences) would require more cost-effective methods or a more efficient computing system, neither of which is currently available for our genome-wide study. We also note that the purpose of this study is to identify subsets of SNPs with high predictive ability, and SNPs with good discriminant power are not necessarily causal markers for the disease.
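Taking HMSS at its stated definition, the harmonic mean of sensitivity (Se) and specificity (Sp), the criterion used above is:

    \mathrm{HMSS} = \frac{2 \, Se \cdot Sp}{Se + Sp}

which, unlike overall accuracy, cannot be driven up by simply predicting the majority class in imbalanced data.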

Relevance: 30.00%

Abstract:

The three articles that comprise this dissertation describe how small area estimation and geographic information systems (GIS) technologies can be integrated to provide useful information about the number of uninsured and where they are located. Comprehensive data about the numbers and characteristics of the uninsured are typically available only from surveys. Utilization and administrative data are poor proxies from which to develop this information: those who cannot access services are unlikely to be fully captured, either by health care provider utilization data or by state and local administrative data. In the absence of direct measures, a well-developed estimate of the local uninsured count or rate can prove valuable when assessing the unmet health service needs of this population. However, the fact that these are “estimates” increases the chances that results will be rejected or, at best, treated with suspicion. The visual impact and spatial analysis capabilities afforded by GIS technology can strengthen the likelihood that area estimates will be accepted by those most likely to benefit from the information, including health planners and policy makers.

The first article describes how uninsured estimates are currently produced in the Houston metropolitan region. It details the synthetic model used to calculate the numbers and percentages of uninsured, and how the resulting estimates are integrated into a GIS. The second article compares the estimation method of the first article with one currently used by the Texas State Data Center to estimate the number of uninsured for all Texas counties. Estimates are developed for census tracts in Harris County, using both models with the same data sets, and the results are statistically compared. The third article describes a new, revised synthetic method that is being tested to provide uninsured estimates at sub-county levels for eight counties in the Houston metropolitan area. It is designed to replicate the same categorical results provided by a current U.S. Census Bureau estimation method. The estimates calculated by this revised model are compared to the most recent U.S. Census Bureau estimates, using the same areas and population categories.
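The articles' exact model is not given here; a synthetic small-area estimate of the kind described is typically obtained by applying survey-based, group-specific uninsured rates to local population counts (the grouping by, e.g., age, sex, income, and ethnicity is an assumption for illustration):

    \hat{N}^{\,\mathrm{uninsured}}_{a} \;=\; \sum_{g} P_{a,g} \, \hat{r}_{g}

where P_{a,g} is the population of demographic group g in area a (e.g. a census tract) and \hat{r}_{g} is the survey-based uninsured rate for that group; dividing by the area's total population gives the corresponding uninsured rate.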