934 results for computed tomographic scan artifact, false positive, facet subluxation


Relevance: 100.00%

Abstract:

Weakly electric fish produce a dual-function electric signal that makes them ideal models for the study of sensory computation and signal evolution. This signal, the electric organ discharge (EOD), is used for communication and navigation. In some families of gymnotiform electric fish, the EOD is a dynamic signal that increases in amplitude during social interactions. Amplitude increase could facilitate communication by increasing the likelihood of being sensed by others or by impressing prospective mates or rivals. Conversely, by increasing its signal amplitude a fish might increase its sensitivity to objects by lowering its electrolocation detection threshold. To determine how EOD modulations elicited in the social context affect electrolocation, I developed an automated and fast method for measuring electroreception thresholds using a classical conditioning paradigm. This method employs a moving shelter tube, which these fish occupy at rest during the day, paired with an electrical stimulus. A custom-built and programmed robotic system presents the electrical stimulus to the fish, slides the shelter tube so that the fish must follow, and records video of their movements. Electric fish of the genus Sternopygus were trained to respond to a resistive stimulus on this apparatus within 2 days. The motion detection algorithm correctly identifies responses 91% of the time, with a false positive rate of only 4%. This system allows for a large number of trials, decreasing the amount of time needed to determine behavioral electroreception thresholds. This novel method enables evaluation of the evolutionary interplay between two conflicting sensory demands: social communication and navigation.
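The abstract does not include the analysis code; the minimal Python sketch below only illustrates how video trials might be scored by frame differencing and how hit and false-positive rates could be tallied. All names, array shapes, and threshold values are assumptions for illustration, not the author's actual implementation.

```python
# Illustrative sketch (not the author's code): classify a trial as "followed
# the shelter tube" vs. "no response" by frame differencing inside an assumed
# region of interest, then summarize hit and false-positive rates.
import numpy as np

def responded(frames, roi, pixel_thresh=25, motion_frac=0.02):
    """frames: (T, H, W) grayscale video of one trial; roi: (y0, y1, x0, x1)."""
    y0, y1, x0, x1 = roi
    clip = frames[:, y0:y1, x0:x1].astype(np.int16)
    diffs = np.abs(np.diff(clip, axis=0))                # frame-to-frame change
    moving = (diffs > pixel_thresh).mean(axis=(1, 2))    # fraction of changed pixels
    return bool((moving > motion_frac).any())            # any frame with enough motion

def rates(predictions, ground_truth):
    """Hit rate on stimulus trials, false-positive rate on blank trials."""
    p = np.asarray(predictions, bool)
    g = np.asarray(ground_truth, bool)
    hit_rate = p[g].mean() if g.any() else float("nan")
    fp_rate = p[~g].mean() if (~g).any() else float("nan")
    return hit_rate, fp_rate
```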

Relevance: 100.00%

Abstract:

The growing need for fast sampling of explosives in high-throughput areas has increased the demand for improved technology for the trace detection of illicit compounds. Detection of the volatiles associated with the presence of these illicit compounds offers an alternative approach for sensitive trace detection without increasing the false positive alarm rate. This study evaluated the performance of non-contact sampling and detection systems using statistical analysis through the construction of Receiver Operating Characteristic (ROC) curves in real-world scenarios for the detection of volatiles in the headspace of smokeless powder, used as the model system for generalizing explosives detection. A novel sorbent-coated disk, coined planar solid phase microextraction (PSPME), was previously used for rapid, non-contact sampling of the headspace of containers. The limits of detection for PSPME coupled to IMS detection were determined to be 0.5-24 ng for vapor sampling of volatile chemical compounds associated with illicit compounds, and the substrate demonstrated an extraction efficiency three times greater than other commercially available substrates, retaining >50% of the analyte after 30 minutes of sampling an analyte spike, compared with a non-detect for the unmodified filters. Both static and dynamic PSPME sampling were used, coupled with two ion mobility spectrometer (IMS) detection systems, in which 10-500 mg quantities of smokeless powders were detected within 5-10 minutes of static sampling or 1 minute of dynamic sampling in 1-45 L closed systems, resulting in faster sampling and analysis times in comparison to conventional solid phase microextraction-gas chromatography-mass spectrometry (SPME-GC-MS) analysis. Similar real-world scenarios were sampled in low- and high-clutter environments with zero false positive rates. Excellent PSPME-IMS detection of the volatile analytes was evident from the ROC curves, with areas under the curve (AUC) of 0.85-1.0 and 0.81-1.0 for portable and bench-top IMS systems, respectively. ROC curves were also constructed for SPME-GC-MS, yielding AUC of 0.95-1.0, comparable with PSPME-IMS detection. The PSPME-IMS technique produces fewer false positive results for non-contact vapor sampling, cutting cost and providing the effective sampling and detection needed in high-throughput scenarios, with performance similar to well-established techniques and the added advantage of fast detection in the field.
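As a companion to the ROC/AUC figures quoted above, here is a hedged Python sketch of how a ROC curve and its area can be computed from detector responses and ground-truth labels; the scores and labels are invented for illustration and are not the study's data.

```python
# Sketch: build a ROC curve and AUC from hypothetical detector responses
# (e.g., IMS signal intensities) and known positive/negative samples.
import numpy as np

def roc_curve(scores, labels):
    """Return (fpr, tpr) arrays obtained by sweeping the decision threshold."""
    scores = np.asarray(scores, float)
    labels = np.asarray(labels, bool)
    order = np.argsort(-scores)               # sort by descending score
    labels = labels[order]
    tps = np.cumsum(labels)                   # true positives at each cutoff
    fps = np.cumsum(~labels)                  # false positives at each cutoff
    tpr = tps / labels.sum()
    fpr = fps / (~labels).sum()
    return np.r_[0.0, fpr], np.r_[0.0, tpr]

def auc(fpr, tpr):
    return float(np.trapz(tpr, fpr))          # trapezoidal area under the curve

# Made-up example: positives tend to score higher than negatives.
scores = [8.1, 7.4, 6.9, 5.0, 4.2, 3.3, 2.8, 1.5]
labels = [1,   1,   1,   0,   1,   0,   0,   0]
print(auc(*roc_curve(scores, labels)))        # 0.9375 for this toy data
```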

Relevance: 100.00%

Abstract:

Kernel-level malware is one of the most dangerous threats to the security of users on the Internet, so there is an urgent need for its detection. The most popular detection approach is misuse-based detection. However, it cannot keep up with today's advanced malware, which increasingly applies polymorphism and obfuscation. In this thesis, we present our integrity-based detection for kernel-level malware, which does not rely on the specific features of malware. We have developed an integrity analysis system that can derive and monitor integrity properties for commodity operating system kernels. In our system, we focus on two classes of integrity properties: data invariants and the integrity of Kernel Queue (KQ) requests. We adopt static analysis for data invariant detection and overcome several technical challenges: field-sensitivity, array-sensitivity, and pointer analysis. We identify data invariants that are critical to system runtime integrity from the Linux kernel 2.4.32 and the Windows Research Kernel (WRK) with very low false positive and false negative rates. We then develop an Invariant Monitor to guard these data invariants against real-world malware. In our experiments, we are able to use Invariant Monitor to detect ten real-world Linux rootkits, nine real-world Windows malware samples, and one synthetic Windows malware sample. We leverage static and dynamic analysis of the kernel and device drivers to learn the legitimate KQ requests. Based on the learned KQ requests, we build KQguard to protect KQs. At runtime, KQguard rejects all unknown KQ requests that cannot be validated. We apply KQguard to WRK and the Linux kernel, and extensive experimental evaluation shows that KQguard is efficient (up to 5.6% overhead) and effective (capable of achieving zero false positives against representative benign workloads after appropriate training, and very low false negatives against 125 real-world malware samples and nine synthetic attacks). In our system, Invariant Monitor and KQguard cooperate to protect data invariants and KQs in the target kernel. By monitoring these integrity properties, we can detect malware through its violation of them during execution.
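The thesis implements KQguard inside the kernel; the Python sketch below is only a conceptual analogy of the learn-then-validate whitelist idea (learn legitimate KQ request signatures during training, reject unknown ones at runtime). The signature fields and example module names are assumptions, not the thesis's exact representation.

```python
# Conceptual sketch only: a whitelist of learned KQ request signatures;
# anything not learned during benign training is rejected at "runtime".
from dataclasses import dataclass

@dataclass(frozen=True)
class KQSignature:
    queue: str            # e.g. "timer", "dpc", "workqueue" (illustrative)
    callback_module: str  # module that owns the callback function
    callback_symbol: str  # resolved symbol name of the callback

class KQGuard:
    def __init__(self):
        self.whitelist: set[KQSignature] = set()

    def learn(self, sig: KQSignature) -> None:
        """Training phase: record signatures observed from benign workloads."""
        self.whitelist.add(sig)

    def validate(self, sig: KQSignature) -> bool:
        """Runtime phase: accept only requests whose signature was learned."""
        return sig in self.whitelist

guard = KQGuard()
guard.learn(KQSignature("timer", "ext3.ko", "commit_timeout"))
assert guard.validate(KQSignature("timer", "ext3.ko", "commit_timeout"))
assert not guard.validate(KQSignature("timer", "rootkit.ko", "hide_proc"))
```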

Relevance: 100.00%

Abstract:

Funding — Forest Enterprise Scotland and the University of Aberdeen provided funding for the project. The Carnegie Trust supported the lead author, E. McHenry, in this research through the award of a tuition fees bursary.

Relevance: 100.00%

Abstract:

Colorectal cancer is the third most commonly diagnosed cancer, accounting for 53,219 deaths in 2007 and an estimated 146,970 new cases in the USA during 2009. The combination of FDG PET and CT has proven to be of great benefit for the assessment of colorectal cancer. This is most evident in the detection of occult metastases, particularly intra- or extrahepatic sites of disease that would preclude a curative procedure, or in the detection of local recurrence. FDG PET is generally not used for the diagnosis of colorectal cancer, although there are circumstances where PET-CT may make the initial diagnosis, particularly with its more widespread use. In addition, precancerous adenomatous polyps can also be detected incidentally on whole-body images performed for other indications; sensitivity increases with increasing polyp size. False-negative FDG PET findings have been reported with mucinous adenocarcinoma, and false-positive findings have been reported due to inflammatory conditions such as diverticulitis, colitis, and postoperative scarring. Therefore, detailed evaluation of the CT component of a PET/CT exam, including assessment of the entire colon, is essential.

Relevance: 100.00%

Abstract:

Human use of the oceans is increasingly in conflict with conservation of endangered species. Methods for managing the spatial and temporal placement of industries such as the military, fishing, transportation and offshore energy have historically been post hoc; i.e., the time and place of human activity is often already determined before environmental impacts are assessed. In this dissertation, I build robust species distribution models in two case study areas, the US Atlantic (Best et al. 2012) and British Columbia (Best et al. 2015), predicting presence and abundance, respectively, from scientific surveys. These models are then applied to novel decision frameworks for preemptively suggesting optimal placement of human activities in space and time to minimize ecological impacts: siting offshore wind energy development, and routing ships to minimize the risk of striking whales. Both decision frameworks present the tradeoff between conservation risk and industry profit with synchronized variable and map views as online spatial decision support systems.

For siting offshore wind energy development (OWED) in the U.S. Atlantic (chapter 4), bird density maps are combined across species, weighted by each species' sensitivity to collision and displacement from OWED, and 10 km² sites are compared against OWED profitability based on average annual wind speed at 90 m hub height and distance to the transmission grid. A spatial decision support system enables toggling between the map and tradeoff plot views by site. A selected site can be inspected for sensitivity to cetaceans throughout the year, so as to identify months that minimize episodic impacts of pre-operational activities such as seismic airgun surveying and pile driving.
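A rough Python sketch of the siting tradeoff described above, assuming hypothetical density rasters, sensitivity weights, and a toy profitability index; it only illustrates combining sensitivity-weighted bird density with site profitability and flagging non-dominated ("efficient frontier") sites, not the dissertation's actual scoring.

```python
# Illustrative only: combine per-species bird density rasters into an OWED
# sensitivity surface, then pair each site with a simple profitability score.
import numpy as np

# density[s] is a (rows, cols) raster for species s; weights reflect assumed
# relative vulnerability to collision and displacement.
density = {"gannet": np.random.rand(20, 20), "loon": np.random.rand(20, 20)}
weights = {"gannet": 0.8, "loon": 1.0}

sensitivity = sum(weights[s] * density[s] for s in density)  # conservation risk per site

wind_speed = 7 + 2 * np.random.rand(20, 20)      # m/s at hub height (assumed)
dist_to_grid = 5 + 45 * np.random.rand(20, 20)   # km to transmission grid (assumed)
profit = wind_speed**3 - 0.5 * dist_to_grid      # toy profitability index

# A site is on the "efficient frontier" if no other site is both more
# profitable and less sensitive.
flat = np.c_[profit.ravel(), sensitivity.ravel()]
frontier = [i for i, (p, s) in enumerate(flat)
            if not ((flat[:, 0] > p) & (flat[:, 1] < s)).any()]
```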

Routing ships to avoid whale strikes (chapter 5) can similarly be viewed as a tradeoff, but it is a different problem spatially. A cumulative cost surface is generated from density surface maps and the conservation status of cetaceans, and is then applied as a resistance surface to calculate least-cost routes between start and end locations, i.e. ports and entrance locations to the study areas. Varying a multiplier on the cost surface enables calculation of multiple routes with different costs to conservation of cetaceans versus cost to the transportation industry, measured as distance. As in the siting chapter, a spatial decision support system enables toggling between the map and tradeoff plot views of proposed routes. The user can also input arbitrary start and end locations to calculate the tradeoff on the fly.
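The routing step can be illustrated with a small Dijkstra least-cost-path sketch in Python over a hypothetical risk surface; the grid, costs, and multiplier below are assumptions, not the dissertation's data or code.

```python
# Sketch: least-cost route over a grid whose per-cell cost is travel distance
# plus a multiplier times a (hypothetical) cetacean conservation-risk surface.
# Varying the multiplier traces out routes trading ship distance against risk.
import heapq
import numpy as np

def least_cost_route(risk, start, end, multiplier):
    rows, cols = risk.shape
    cell_cost = 1.0 + multiplier * risk          # 1.0 ~ per-cell travel distance
    dist = np.full(risk.shape, np.inf)
    prev = {}
    dist[start] = 0.0
    heap = [(0.0, start)]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if (r, c) == end:
            break
        if d > dist[r, c]:
            continue
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cell_cost[nr, nc]
                if nd < dist[nr, nc]:
                    dist[nr, nc] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(heap, (nd, (nr, nc)))
    path, node = [], end
    while node != start:
        path.append(node)
        node = prev[node]
    return [start] + path[::-1], dist[end]

risk = np.random.rand(50, 50)                    # stand-in density/status surface
route, cost = least_cost_route(risk, (0, 0), (49, 49), multiplier=5.0)
```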

Essential inputs to these decision frameworks are the distributions of the species. The two preceding chapters comprise species distribution models for the two case study areas, the U.S. Atlantic (chapter 2) and British Columbia (chapter 3), predicting presence and density, respectively. Although density is preferred for estimating potential biological removal, per Marine Mammal Protection Act requirements in the U.S., the necessary parameters, especially distance and angle of observation, are less readily available across publicly mined datasets.

In the case of predicting cetacean presence in the U.S. Atlantic (chapter 2), I extracted datasets from the online OBIS-SEAMAP geo-database and integrated scientific surveys conducted by ship (n=36) and aircraft (n=16), weighting a Generalized Additive Model by minutes surveyed within space-time grid cells to harmonize effort between the two survey platforms. For each of 16 cetacean species guilds, I predicted the probability of occurrence from static environmental variables (water depth, distance to shore, distance to continental shelf break) and time-varying conditions (monthly sea-surface temperature). To generate maps of presence vs. absence, Receiver Operating Characteristic (ROC) curves were used to define the optimal threshold that minimizes the false positive and false negative error rates. I integrated model outputs, including tables (species in guilds, input surveys) and plots (fit of environmental variables, ROC curve), into an online spatial decision support system, allowing for easy navigation of models by taxon, region, season, and data provider.
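For the threshold-selection step, a minimal Python sketch follows: it picks the cutoff on predicted occurrence probability that minimizes the sum of false positive and false negative rates (equivalently, maximizes Youden's J = TPR - FPR). The probabilities and presences are invented for illustration.

```python
# Sketch: choose the presence/absence threshold minimizing FPR + FNR.
import numpy as np

def optimal_threshold(prob, presence):
    prob = np.asarray(prob, float)
    presence = np.asarray(presence, bool)
    best_t, best_j = 0.5, -np.inf
    for t in np.unique(prob):
        pred = prob >= t
        tpr = (pred & presence).sum() / presence.sum()
        fpr = (pred & ~presence).sum() / (~presence).sum()
        j = tpr - fpr                      # maximizing J minimizes FPR + FNR
        if j > best_j:
            best_t, best_j = t, j
    return best_t

# Toy predicted probabilities and observed presences:
threshold = optimal_threshold([0.1, 0.4, 0.35, 0.8, 0.7], [0, 0, 1, 1, 1])
presence_map = lambda p: p >= threshold    # apply to gridded predictions
```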

For predicting cetacean density within the inner waters of British Columbia (chapter 3), I calculated density from systematic, line-transect marine mammal surveys conducted by Raincoast Conservation Foundation over multiple years and seasons (summer 2004, 2005, 2008, and spring/autumn 2007). Abundance estimates were calculated using two different methods: Conventional Distance Sampling (CDS) and Density Surface Modelling (DSM). CDS generates a single density estimate for each stratum, whereas DSM explicitly models spatial variation and offers potential for greater precision by incorporating environmental predictors. Although DSM yields a more relevant product for the purposes of marine spatial planning, CDS has proven useful in cases where fewer observations are available for seasonal and inter-annual comparison, particularly for the scarcely observed elephant seal. Abundance estimates are provided on a stratum-specific basis. Steller sea lions and harbour seals are further differentiated by ‘hauled out’ and ‘in water’. This analysis updates previous estimates (Williams & Thomas 2007) by including additional years of effort, providing greater spatial precision with the DSM method over CDS, offering novel reporting for the spring and autumn seasons (rather than summer alone), and providing new abundance estimates for Steller sea lion and northern elephant seal. In addition to providing a baseline of marine mammal abundance and distribution against which future changes can be compared, this information offers the opportunity to assess the risks posed to marine mammals by existing and emerging threats, such as fisheries bycatch, ship strikes, and increased oil spill and ocean noise risks associated with growth in container ship and oil tanker traffic in British Columbia’s continental shelf waters.
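For reference, a small Python sketch of the conventional distance sampling (CDS) density estimator mentioned above, using made-up numbers rather than the Raincoast survey values.

```python
# Sketch of the CDS estimator: density = detected clusters x mean cluster size
# divided by the effectively surveyed area 2 * w * L * p_hat.
def cds_density(n_clusters, mean_cluster_size, w_km, L_km, p_hat):
    """Animals per km^2 for one stratum.

    n_clusters        : clusters detected within the truncation distance
    mean_cluster_size : expected cluster size
    w_km              : truncation (half-strip) width in km
    L_km              : total transect length in km
    p_hat             : estimated detection probability within w
    """
    covered_area = 2.0 * w_km * L_km             # strip surveyed on both sides
    return n_clusters * mean_cluster_size / (covered_area * p_hat)

# Hypothetical example values:
density = cds_density(n_clusters=42, mean_cluster_size=3.1,
                      w_km=2.0, L_km=1200.0, p_hat=0.55)
abundance = density * 18000.0                    # times stratum area in km^2 (assumed)
```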

Starting with marine animal observations at specific coordinates and times, I combine these data with environmental data, often satellite-derived, to produce seascape predictions that generalize in space and time. These habitat-based models enable prediction of encounter rates and, in the case of density surface models, abundance that can then be applied to management scenarios. Specific human activities, OWED and shipping, are then compared within a tradeoff decision support framework, enabling interchangeable map and tradeoff plot views. These products make complex processes transparent, enabling conservation interests, industry, and stakeholders to game scenarios towards optimal marine spatial management, fundamental to the tenets of marine spatial planning, ecosystem-based management, and dynamic ocean management.

Relevance: 100.00%

Abstract:

Visual hallucinations seem to be more prevalent in low light, and hallucinators tend to be more prone to false-positive-type errors in memory tasks. Here we investigated whether the richness of stimuli does indeed affect recognition differently in hallucinating and non-hallucinating participants, and if so, whether this difference extends to identifying spatial context. We compared 36 Parkinson's disease (PD) patients with visual hallucinations, 32 Parkinson's patients without hallucinations, and 36 age-matched controls on a visual memory task in which color and black-and-white pictures were presented at different locations. Participants had to recognize the pictures among distracters, along with the location of each stimulus. Findings revealed clear differences in performance between the groups. Both PD groups had impaired recognition compared to the controls, but those with hallucinations were significantly more impaired on black-and-white than on color stimuli. In addition, the group with hallucinations was significantly impaired compared to the other two groups on spatial memory. We suggest that not only do PD patients have poorer recognition of pictorial stimuli than controls, but also that those who present with visual hallucinations appear to be more heavily reliant on bottom-up sensory input and impaired in spatial ability.

Relevance: 100.00%

Abstract:

An investigation into karst hazard in southern Ontario has been undertaken with the intention of developing predictive karst models for this region. Such models are not currently feasible because of a lack of sufficient karst data, though this is not entirely due to a lack of karst features. Geophysical data were collected at Lake on the Mountain, Ontario, as part of this karst investigation, in order to test the long-standing hypothesis that Lake on the Mountain was formed by a sinkhole collapse. Sub-bottom acoustic profiling data were collected to image the lake-bottom sediments and bedrock. Vertical bedrock features interpreted as solutionally enlarged fractures were taken as evidence for karst processes on the lake bottom. Additionally, the bedrock topography shows a narrower and more elongated basin than was previously identified, which also lies parallel to a mapped fault system in the area. This suggests that Lake on the Mountain formed over a fault zone, which further supports the sinkhole hypothesis, as faulting would provide groundwater pathways for karst dissolution. Previous sediment cores suggest that Lake on the Mountain would have formed at some point during the Wisconsinan glaciation, with glacial meltwater and glacial loading as potential contributing factors to sinkhole development. A probabilistic karst model for the state of Kentucky, USA, has been generated using the Weights of Evidence method. This model is presented as an example of the predictive capabilities of this kind of data-driven modelling technique and to show how such models could be applied to karst in Ontario. The model was able to classify 70% of the validation dataset correctly while minimizing false positive identifications; this is moderately successful and could stand to be improved. Finally, improvements to the current karst model of southern Ontario are suggested, with the goal of increasing investigation into karst in Ontario and streamlining the reporting system for sinkholes, caves, and other karst features so as to improve the current Ontario karst database.
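A brief Python sketch of the Weights of Evidence calculation referenced above, using invented cell counts: the positive and negative weights for one evidential layer are log ratios of how often the layer coincides with known karst features versus non-feature cells, and their difference (the contrast C) summarizes the layer's predictive strength.

```python
# Sketch with made-up counts (not the Kentucky model's values).
import math

def weights_of_evidence(n_cells, n_events, n_layer, n_layer_events):
    """n_cells: study-area cells; n_events: cells with known karst features;
    n_layer: cells where the evidential layer is present;
    n_layer_events: cells with both the layer and a karst feature."""
    p_layer_given_event = n_layer_events / n_events
    p_layer_given_no_event = (n_layer - n_layer_events) / (n_cells - n_events)
    p_nolayer_given_event = 1.0 - p_layer_given_event
    p_nolayer_given_no_event = 1.0 - p_layer_given_no_event
    w_plus = math.log(p_layer_given_event / p_layer_given_no_event)
    w_minus = math.log(p_nolayer_given_event / p_nolayer_given_no_event)
    return w_plus, w_minus, w_plus - w_minus      # contrast C

w_plus, w_minus, contrast = weights_of_evidence(
    n_cells=100_000, n_events=400, n_layer=20_000, n_layer_events=250)
```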

Relevance: 100.00%

Abstract:

Fluorescence in situ hybridization (FISH) for FOXO1 gene rearrangement and reverse transcription-polymerase chain reaction (PCR) for PAX3/7-FOXO1 fusion transcripts have become routine ancillary tools for the diagnosis of alveolar rhabdomyosarcomas (ARMS). Here we summarize our experience with these adjunct diagnostic modalities at a tertiary center, presenting the largest comparative series of FISH and PCR for suspected or possible ARMS to date. All suspected or possible ARMS tested by FISH or PCR for FOXO1 rearrangement or PAX3/7-FOXO1 fusion transcripts over a 7-year period were included, and FISH and PCR results were correlated with clinical and histologic findings. One hundred samples from 95 patients had FISH and/or PCR performed. FISH had a higher rate of technical success (96.8%) than PCR (88%). Where both tests were utilized successfully, there was a high concordance rate between them (94.9%). Of 24 histologic ARMS tested by FISH or PCR, 83.3% were translocation-positive (all PAX3-FOXO1 by PCR), including 3 histologic solid variants. In 76 cases where ARMS was excluded, there were 3 potential false-positive cases with FISH but none with PCR. PCR had similar sensitivity (85.7%) and better specificity (100%) in aiding the diagnosis of ARMS, compared with FISH (85% and 95.8%, respectively). All solid variant ARMS harbored FOXO1 gene rearrangements, and PAX3-FOXO1 ARMS were detected to the exclusion of PAX7-FOXO1. In comparative analysis, both FISH and PCR are useful in aiding the diagnosis of ARMS and excluding its sarcomatous mimics. FISH is technically more reliable but has lower specificity than PCR. In cases where ARMS is in the differential diagnosis, it is optimal to perform both PCR and FISH: both have similar sensitivities for detecting ARMS, but FISH may confirm or exclude cases that are technically unsuccessful with PCR, while PCR can detect specific fusion transcripts that may be useful prognostically.
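For clarity, sensitivity and specificity figures such as those quoted above are derived from confusion-table counts as in the small sketch below; the counts shown are illustrative, not the study's exact tallies.

```python
# Illustrative only: sensitivity/specificity from true/false positive and
# negative counts.
def sensitivity_specificity(tp, fp, tn, fn):
    sensitivity = tp / (tp + fn)   # detected cases among true ARMS
    specificity = tn / (tn + fp)   # correctly excluded among non-ARMS cases
    return sensitivity, specificity

# e.g. a test detecting 17 of 20 true cases and wrongly flagging 2 of 70 others:
sens, spec = sensitivity_specificity(tp=17, fp=2, tn=68, fn=3)
```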

Relevance: 100.00%

Abstract:

BACKGROUND: Although most gastrointestinal stromal tumours (GIST) carry oncogenic mutations in KIT exons 9, 11, 13 and 17, or in platelet-derived growth factor receptor alpha (PDGFRA) exons 12, 14 and 18, around 10% of GIST are free of these mutations. Genotyping and accurate detection of KIT/PDGFRA mutations in GIST are becoming increasingly useful for clinicians in the management of the disease. METHOD: To evaluate and improve laboratory practice in GIST mutation detection, we developed a mutational screening quality control program. Eleven laboratories were enrolled in this program and 50 DNA samples were analysed, each by four different laboratories, giving 200 mutational reports. RESULTS: In total, eight mutations were not detected by at least one laboratory, and one false positive result was reported in one sample. Thus, the mean global rate of error with clinical implication, based on the 200 reports, was 4.5% (9 of 200 reports). Concerning the detection of specific polymorphisms, the rate varied from 0 to 100%, depending on the laboratory. The way mutations were reported was very heterogeneous, and some errors were detected. CONCLUSION: This study demonstrated that such a program is necessary for laboratories to improve the quality of the analysis, because an error rate of 4.5% may have clinical consequences for the patient.

Relevance: 100.00%

Abstract:

BACKGROUND AND OBJECTIVE: The main difficulty of PCR-based clonality studies for B-cell lymphoproliferative disorders (B-LPD) is discrimination between monoclonal and polyclonal PCR products, especially when there is a high background of polyclonal B cells in the tumor sample. Currently, PCR-based methods for clonality assessment require additional analysis of the PCR products in order to discern between monoclonal and polyclonal samples. Heteroduplex analysis represents an attractive approach since it is easy to perform and avoids the use of radioactive substrates or expensive equipment. DESIGN AND METHODS: We studied the sensitivity and specificity of heteroduplex PCR analysis for detecting monoclonality in samples from 90 B-cell non-Hodgkin's lymphoma (B-NHL) patients and in 28 individuals without neoplastic B-cell disorders (negative controls). Furthermore, in 42 B-NHL cases and the same 28 negative controls, we compared heteroduplex analysis with the classical PCR technique. We also compared ethidium bromide (EtBr) vs. silver nitrate (AgNO3) staining, as well as agarose vs. polyacrylamide gel electrophoresis (PAGE). RESULTS: Using two pairs of consensus primers located at VH (FR3 and FR2) and at JH, 91% of B-NHL samples displayed monoclonal products after heteroduplex PCR analysis using PAGE and AgNO3 staining. Moreover, no polyclonal sample showed a monoclonal PCR product. By contrast, false positive results were obtained when using agarose (5/28) and PAGE without heteroduplex analysis: 2/28 and 8/28 with EtBr and AgNO3 staining, respectively. In addition, false negative results only appeared with EtBr staining: 13/42 in agarose, 4/42 in PAGE without heteroduplex analysis and 7/42 in PAGE after heteroduplex analysis. INTERPRETATION AND CONCLUSIONS: We conclude that AgNO3-stained PAGE after heteroduplex analysis is the most suitable strategy for detecting monoclonal rearrangements in B-NHL samples, because it does not produce false-positive results and the risk of false-negative results is very low.

Relevance: 100.00%

Abstract:

BACKGROUND AND OBJECTIVE: Molecular analysis by PCR of monoclonally rearranged immunoglobulin (Ig) genes can be used for diagnosis in B-cell lymphoproliferative disorders (LPD), as well as for monitoring minimal residual disease (MRD) after treatment. This technique carries a risk of false-positive results due to the "background" amplification of similar rearrangements derived from polyclonal B-cells. This problem can be resolved in advance by additional analyses that discern between polyclonal and monoclonal PCR products, such as heteroduplex analysis. A second problem is that PCR frequently fails to amplify the junction regions, mainly because of the somatic mutations frequently present in mature (post-follicular) B-cell lymphoproliferations. The use of additional targets (e.g. Ig light chain genes) can avoid this problem. DESIGN AND METHODS: We studied the specificity of heteroduplex PCR analysis of several Ig junction regions to detect monoclonal products in samples from 84 multiple myeloma (MM) patients and 24 patients with polyclonal B-cell disorders. RESULTS: Using two distinct VH consensus primers (FR3 and FR2) in combination with one JH primer, 79% of the MM samples displayed monoclonal products. The percentage of positive cases was increased by amplification of the Vλ-Jλ junction regions or Kde rearrangements, using two or five pairs of consensus primers, respectively. After including these targets in the heteroduplex PCR analysis, 93% of MM cases displayed monoclonal products. None of the polyclonal samples analyzed resulted in monoclonal products. Dilution experiments showed that monoclonal rearrangements could be detected with a sensitivity of at least 10^-2 in a background of >30% polyclonal B-cells, the sensitivity increasing up to 10^-3 when the polyclonal background was

Relevance: 100.00%

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-06