948 results for ERROR rates


Relevance: 60.00%

Abstract:

We have examined the statistics of simulated bit-error rates in optical transmission systems with strong patterning effects and have found a strong correlation between the probability of marks in a pseudorandom pattern and the error-free transmission distance. We discuss how a reduced density of marks can be achieved by preencoding optical data.
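A low-weight block code is one simple way to realize the preencoding idea above; purely as a hedged illustration (not the paper's actual scheme), the Python sketch below maps each 8-bit data word onto one of the 256 lowest-weight 9-bit codewords, trading a 12.5% bandwidth overhead for a reduced density of marks.

```python
# Illustrative low-weight block code: NOT the paper's preencoding scheme.
import random

# All 9-bit words sorted by popcount; the first 256 have at most four marks.
codebook = sorted(range(512), key=lambda w: bin(w).count("1"))[:256]

def mark_density(words, width):
    """Fraction of transmitted bits that are marks ("1" bits)."""
    ones = sum(bin(w).count("1") for w in words)
    return ones / (len(words) * width)

random.seed(0)
data = [random.getrandbits(8) for _ in range(10_000)]  # pseudorandom pattern
encoded = [codebook[w] for w in data]                  # 8b -> 9b low-weight words

print(f"raw 8-bit mark density:     {mark_density(data, 8):.3f}")     # ~0.50
print(f"encoded 9-bit mark density: {mark_density(encoded, 9):.3f}")  # ~0.36
```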

Relevance: 60.00%

Abstract:

Rotation invariance is important for an iris recognition system, since changes of head orientation and binocular vergence may cause eye rotation. Conventional methods of iris recognition cannot achieve true rotation invariance; they achieve only approximate invariance by rotating the feature vector before matching or by unwrapping the iris ring at different initial angles. These workarounds increase the complexity of the method, and when the rotation exceeds a certain range, their error rates may increase substantially. To solve this problem, a new rotation-invariant approach for iris feature extraction based on non-separable wavelets is proposed in this paper. First, a bank of non-separable orthogonal wavelet filters is used to capture characteristics of the iris. Second, a Markov random field model is used to capture rotation-invariant iris features. Finally, two-class kernel Fisher classifiers are adopted for classification. Experimental results on public iris databases show that the proposed approach has a low error rate and achieves true rotation invariance.
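For contrast, the conventional workaround mentioned above is easy to sketch: circularly shift one binary iris code against the other and keep the minimum Hamming distance. This is a generic illustration with stand-in codes, not the paper's method; the invariance holds only while the true rotation stays within max_shift, which is precisely the limitation the proposed approach removes.

```python
# Conventional rotation compensation by shifted matching (generic sketch).
import numpy as np

def min_shifted_hamming(code_a, code_b, max_shift=8):
    """Best normalized Hamming distance over circular shifts of code_b,
    approximating eye rotation as a shift in angular coordinates."""
    return min(np.mean(code_a != np.roll(code_b, s))
               for s in range(-max_shift, max_shift + 1))

rng = np.random.default_rng(0)
enrolled = rng.integers(0, 2, size=2048)   # stand-in binary iris code
probe = np.roll(enrolled, 5)               # same eye, rotated within range
impostor = rng.integers(0, 2, size=2048)   # different eye

print(min_shifted_hamming(enrolled, probe))     # ~0.00: match despite rotation
print(min_shifted_hamming(enrolled, impostor))  # ~0.48: clear non-match
```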

Relevance: 60.00%

Abstract:

2000 Mathematics Subject Classification: 62H30, 62P99

Relevance: 60.00%

Abstract:

Purpose: Technological devices such as smartphones and tablets are widely available and increasingly used as visual aids. This study evaluated the use of a novel tablet app (MD_evReader) developed as a reading aid for individuals with central field loss resulting from macular degeneration. The MD_evReader app scrolls text as single lines (similar to a news ticker) and is intended to enhance reading performance with the eccentric viewing technique by both reducing the demands on the eye movement system and minimising the deleterious effects of perceptual crowding. Reading performance with scrolling text was compared with reading static sentences, also presented on a tablet computer. Methods: Twenty-six people with low vision (diagnosis of macular degeneration) read static or dynamic text (scrolled from right to left), presented as a single line at high contrast on a tablet device. Reading error rates and comprehension were recorded for both text formats, and the participants' subjective experience of reading with the app was assessed using a simple questionnaire. Results: The average reading speed for static and dynamic text was not significantly different, and equal to or greater than 85 words per minute. Comprehension scores for both text formats were also similar, at approximately 95% correct. However, reading error rates were significantly lower (p=0.02) for dynamic text than for static text. The participants' questionnaire ratings of their reading experience with the MD_evReader were highly positive and indicated a preference for reading with this app over their usual method. Conclusions: Our data show that reading performance with scrolling text is at least equal to that achieved with static text and in some respects (reading error rate) is better. Bespoke apps informed by an understanding of the underlying sensorimotor processes involved in a cognitive task such as reading have excellent potential as aids for people with visual impairments.

Relevance: 60.00%

Abstract:

The use of canines as a method of detection of explosives is well established worldwide, and those applying this technology range from police forces and law enforcement to humanitarian agencies in the developing world. Despite the recent surge in publication of novel instrumental sensors for explosives detection, canines are still regarded by many as the most effective real-time field method of explosives detection. However, unlike instrumental methods, it is currently difficult to determine detection levels, perform calibration of the canines' ability, or produce scientifically valid quality control checks. Accordingly, under increasingly strict requirements regarding forensic evidence admission, such as Frye and Daubert, there is a need for better scientific understanding of the process of canine detection. As with any instrumental technique, peer-reviewed publication of the reliability, success, and error rates of canine detection is required for admissibility. Training is commonly focussed towards high explosives such as TNT and Composition 4, and low explosives such as black and smokeless powders are often added only for completeness. Headspace analyses of explosive samples, performed by Solid Phase Microextraction (SPME) paired with Gas Chromatography - Mass Spectrometry (GC-MS) and Gas Chromatography - Electron Capture Detection (GC-ECD), were conducted, highlighting common odour chemicals. The odour chemicals detected were then presented to previously trained and certified explosives detection canines, and the activity/inactivity of each odour was determined through field trials and experiments. It was demonstrated that TNT and cast explosives share a common odour signature, and the same may be said for plasticized explosives such as Composition C-4 and Deta Sheet. Conversely, smokeless powders were demonstrated not to share common odours. An evaluation of the effectiveness of commercially available pseudo aids reported limited success. The implications of the explosive odour studies upon canine training then led to the development of novel inert training aids based upon the active odours determined.

Relevance: 60.00%

Abstract:

A trial judge serves as gatekeeper in the courtroom to ensure that only reliable expert witness testimony is presented to the jury. Nevertheless, research shows that while judges take their gatekeeper status seriously, legal professionals in general are unable to identify well-conducted research and are unable to define falsifiability, error rates, peer review status, and scientific validity (Gatowski et al., 2001; Kovera & McAuliff, 2000). However, the abilities to identify quality scientific research and define scientific concepts are critical to preventing "junk" science from entering courtrooms. Research thus far has neglected to address that before selecting expert witnesses, judges and attorneys must first evaluate experts' CVs, rather than their scientific testimony, to determine whether legal standards of admissibility have been met. The quality of expert testimony therefore largely depends on the ability to properly evaluate experts' credentials. Theoretical models of decision making suggest that ability/knowledge and motivation are required to process information systematically. Legal professionals (judges and attorneys) were expected to process CVs heuristically when rendering expert witness decisions, owing to a lack of training in areas of psychology expertise. Legal professionals' (N = 150) and undergraduate students' (N = 468) expert witness decisions were examined and compared. Participants were presented with one of two versions of a criminal case calling for the testimony of either a clinical psychology expert or an experimental legal psychology expert. Participants then read one of eight curricula vitae that varied area of expertise (clinical vs. legal psychology), previous expert witness experience (previous experience vs. no previous experience), and scholarly publication record (30 publications vs. no publications) before deciding whether the expert was qualified to testify in the case. Follow-up measures assessed participants' decision making processes. Legal professionals were no better than college students at rendering quality psychology expert witness admissibility decisions, yet they were significantly more confident in their decisions. Legal professionals rated themselves significantly higher than students in ability, knowledge, and motivation to choose an appropriate psychology expert, although their expert witness decisions were equally inadequate. Findings suggest that participants relied on heuristics, such as previous expert witness experience, to render decisions.

Relevance: 60.00%

Abstract:

Elemental analysis can become an important piece of evidence to assist the solution of a case. The work presented in this dissertation aims to evaluate the evidential value of the elemental composition of three particular matrices: ink, paper and glass. In the first part of this study, the analytical performance of LIBS and LA-ICP-MS methods was evaluated for paper, writing inks and printing inks. A total of 350 ink specimens were examined, including black and blue gel inks, ballpoint inks, inkjets and toners originating from several manufacturing sources and/or batches. The paper collection set consisted of over 200 paper specimens originating from 20 different paper sources produced by 10 different plants. Micro-homogeneity studies show smaller variation of elemental composition within a single source (i.e., sheet, pen or cartridge) than the observed variation between different sources (i.e., brands, types, batches). Significant and detectable differences in the elemental profiles of the inks and paper were observed between samples originating from different sources (discrimination of 87–100% of samples, depending on the sample set under investigation and the method applied). These results support the use of elemental analysis, using LA-ICP-MS and LIBS, for the examination of documents and provide additional discrimination to the techniques currently used in document examination. In the second part of this study, a direct comparison between four analytical methods (µ-XRF, solution-ICP-MS, LA-ICP-MS and LIBS) was conducted for glass analyses using interlaboratory studies. The data provided by 21 participants were used to assess the performance of the analytical methods in associating glass samples from the same source and differentiating different sources, as well as the use of different match criteria (confidence intervals (±6s, ±5s, ±4s, ±3s, ±2s), modified confidence intervals, t-tests (sequential univariate, p=0.05 and p=0.01), t-tests with Bonferroni correction (for multivariate comparisons), range overlap, and Hotelling's T² test). Error rates (Type 1 and Type 2) are reported for each of these match criteria and depend on the heterogeneity of the glass sources, the repeatability between analytical measurements, and the number of elements measured. The study provides recommendations for analytical performance-based parameters for µ-XRF and LA-ICP-MS, as well as the best performing match criteria for both analytical techniques, which can be applied now by forensic glass examiners.

Relevance: 60.00%

Abstract:

SmartWater is a chemical taggant used as a crime deterrent. The taggant is a colorless liquid that fluoresces yellow under ultraviolet (UV) light and carries a distinctive, identifiable and traceable elemental composition. For instance, in a breaking-and-entering scenario, the burglar is sprayed with a solution whose elemental signature is custom-made to a specific location. Residues of this taggant persist on skin and other objects and can be easily recovered for further analysis. The product has been used effectively in Europe as a crime deterrent and has recently been introduced in South Florida. In 2014, the Fort Lauderdale Police Department reported a 14% reduction in burglaries following the use of SmartWater products [1]. The International Forensic Research Institute (IFRI) at FIU validated the scientific foundation of the methods of recovery and analysis of these chemical tagging systems using LA-ICP-MS. Analytical figures of merit of the method, such as precision, accuracy, limits of detection, linearity and selectivity, are reported in this study. Moreover, blind samples were analyzed by LA-ICP-MS to compare their chemical signatures to the company's database and to evaluate error rates and the accuracy of the method. This study demonstrated that LA-ICP-MS can be used to effectively detect these traceable taggants, assisting law enforcement agencies in the United States with cases involving transfer of these forensic coding systems.

Relevance: 60.00%

Abstract:

Omnibus tests of significance in contingency tables use statistics of the chi-square type. When the null is rejected, residual analyses are conducted to identify cells in which observed frequencies differ significantly from expected frequencies. Residual analyses are thus conditioned on a significant omnibus test. Conditional approaches have been shown to substantially alter type I error rates in cases involving t tests conditional on the results of a test of equality of variances, or tests of regression coefficients conditional on the results of tests of heteroscedasticity. We show that residual analyses conditional on a significant omnibus test are also affected by this problem, yielding type I error rates that can be up to 6 times larger than nominal rates, depending on the size of the table and the form of the marginal distributions. We explored several unconditional approaches in search of a method that maintains the nominal type I error rate and found that a bootstrap correction for multiple testing achieves this goal. The validity of this approach is documented for two-way contingency tables in the contexts of tests of independence, tests of homogeneity, and fitting psychometric functions. Computer code in MATLAB and R to conduct these analyses is provided as Supplementary Material.
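The authors provide MATLAB and R code; purely as a hedged illustration of the general idea, the Python sketch below calibrates a cutoff for the maximum absolute adjusted residual by resampling under independence, then flags cells against that cutoff rather than against a fixed critical value. The residual formula (Agresti-style adjusted residuals), the example table and all parameter choices are assumptions for illustration, not the paper's exact procedure.

```python
# Unconditional bootstrap correction for multiple testing in residual analysis
# (illustrative sketch; not the authors' published code).
import numpy as np

def adjusted_residuals(table):
    """Adjusted standardized residuals for a two-way contingency table."""
    n = table.sum()
    r = table.sum(axis=1, keepdims=True) / n   # row marginal proportions
    c = table.sum(axis=0, keepdims=True) / n   # column marginal proportions
    e = n * r * c                              # expected counts under independence
    return (table - e) / np.sqrt(e * (1 - r) * (1 - c))

def bootstrap_cutoff(table, alpha=0.05, n_boot=2000, seed=0):
    """(1 - alpha) quantile of max |adjusted residual| under independence."""
    rng = np.random.default_rng(seed)
    n = int(table.sum())
    p = np.outer(table.sum(axis=1), table.sum(axis=0)).ravel() / n**2
    maxima = np.empty(n_boot)
    for b in range(n_boot):
        boot = rng.multinomial(n, p).reshape(table.shape).astype(float)
        maxima[b] = np.abs(adjusted_residuals(boot)).max()
    return np.quantile(maxima, 1 - alpha)

table = np.array([[20.0, 5.0, 10.0],
                  [8.0, 25.0, 7.0]])
cutoff = bootstrap_cutoff(table)   # replaces the usual fixed 1.96 cutoff
print(cutoff)
print(np.abs(adjusted_residuals(table)) > cutoff)  # flagged cells
```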

Relevance: 60.00%

Abstract:

Human use of the oceans is increasingly in conflict with conservation of endangered species. Methods for managing the spatial and temporal placement of industries such as military, fishing, transportation and offshore energy have historically been post hoc; i.e., the time and place of human activity is often already determined before assessment of environmental impacts. In this dissertation, I build robust species distribution models in two case study areas, the US Atlantic (Best et al. 2012) and British Columbia (Best et al. 2015), predicting presence and abundance, respectively, from scientific surveys. These models are then applied to novel decision frameworks for preemptively suggesting optimal placement of human activities in space and time to minimize ecological impacts: siting for offshore wind energy development, and routing ships to minimize the risk of striking whales. Both decision frameworks relate the tradeoff between conservation risk and industry profit with synchronized variable and map views as online spatial decision support systems.

For siting offshore wind energy development (OWED) in the U.S. Atlantic (chapter 4), bird density maps are combined across species with weights of OWED sensitivity to collision and displacement, and 10 km² sites are compared against OWED profitability based on average annual wind speed at 90 m hub height and distance to the transmission grid. A spatial decision support system enables toggling between the map and tradeoff plot views by site. A selected site can be inspected for sensitivity to cetaceans throughout the year, so as to identify months that minimize episodic impacts of pre-operational activities such as seismic airgun surveying and pile driving.

Routing ships to avoid whale strikes (chapter 5) can similarly be viewed as a tradeoff, but is a different problem spatially. A cumulative cost surface is generated from density surface maps and the conservation status of cetaceans, then applied as a resistance surface to calculate least-cost routes between start and end locations, i.e. ports and entrance locations to study areas. Varying a multiplier on the cost surface enables calculation of multiple routes with different costs to conservation of cetaceans versus cost to the transportation industry, measured as distance. As in the siting chapter, a spatial decision support system enables toggling between the map and tradeoff plot views of proposed routes. The user can also input arbitrary start and end locations to calculate the tradeoff on the fly.
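A minimal sketch of that routing step, assuming a gridded cost surface and scikit-image's route_through_array for the least-cost search; the risk surface, endpoints and multiplier values below are stand-ins, not the dissertation's data.

```python
# Least-cost ship routing over a conservation-risk surface (illustrative).
import numpy as np
from skimage.graph import route_through_array

rng = np.random.default_rng(0)
risk = rng.random((100, 100))        # stand-in cetacean risk surface
start, end = (0, 0), (99, 99)        # stand-in port and study-area entrance

for multiplier in (0.0, 1.0, 10.0):
    cost = 1.0 + multiplier * risk   # uniform distance cost + weighted risk
    path, total = route_through_array(cost, start, end, fully_connected=True)
    # As the multiplier grows, routes lengthen to avoid high-risk cells.
    print(f"multiplier={multiplier:>4}: {len(path)} cells, total cost={total:.1f}")
```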

Essential inputs to these decision frameworks are the distributions of the species. The two preceding chapters comprise species distribution models for the two case study areas, the U.S. Atlantic (chapter 2) and British Columbia (chapter 3), predicting presence and density, respectively. Although density is preferred for estimating potential biological removal, per Marine Mammal Protection Act requirements in the U.S., the parameters needed to estimate it, especially distance and angle of observation, are less readily available across publicly mined datasets.

In the case of predicting cetacean presence in the U.S. Atlantic (chapter 2), I extracted datasets from the online OBIS-SEAMAP geo-database and integrated scientific surveys conducted by ship (n=36) and aircraft (n=16), weighting a Generalized Additive Model by minutes surveyed within space-time grid cells to harmonize effort between the two survey platforms. For each of 16 cetacean species guilds, I predicted the probability of occurrence from static environmental variables (water depth, distance to shore, distance to the continental shelf break) and time-varying conditions (monthly sea-surface temperature). To generate maps of presence vs. absence, Receiver Operating Characteristic (ROC) curves were used to define the optimal threshold that minimizes false positive and false negative error rates. I integrated model outputs, including tables (species in guilds, input surveys) and plots (fit of environmental variables, ROC curve), into an online spatial decision support system, allowing easy navigation of models by taxon, region, season, and data provider.
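As a hedged illustration of that thresholding step, the sketch below picks the cutoff on predicted occurrence probability that maximizes Youden's J (true positive rate minus false positive rate), equivalently minimizing the sum of false positive and false negative error rates; the labels and predicted probabilities are simulated stand-ins for the model outputs described above.

```python
# ROC-based threshold selection for presence/absence maps (illustrative).
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(0)
y_true = rng.integers(0, 2, size=500)          # stand-in presence/absence labels
y_prob = 0.4 * y_true + 0.6 * rng.random(500)  # stand-in model probabilities

fpr, tpr, thresholds = roc_curve(y_true, y_prob)
optimal = thresholds[np.argmax(tpr - fpr)]   # Youden's J: minimizes FPR + FNR
presence_map = y_prob >= optimal             # binarized predictions
print(f"optimal threshold: {optimal:.3f}")
```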

For predicting cetacean density within the inner waters of British Columbia (chapter 3), I calculated density from systematic line-transect marine mammal surveys conducted by Raincoast Conservation Foundation over multiple years and seasons (summer 2004, 2005, 2008, and spring/autumn 2007). Abundance estimates were calculated using two different methods: Conventional Distance Sampling (CDS) and Density Surface Modelling (DSM). CDS generates a single density estimate for each stratum, whereas DSM explicitly models spatial variation and offers potential for greater precision by incorporating environmental predictors. Although DSM yields a more relevant product for the purposes of marine spatial planning, CDS has proven useful in cases where fewer observations are available for seasonal and inter-annual comparison, particularly for the scarcely observed elephant seal. Abundance estimates are provided on a stratum-specific basis. Steller sea lions and harbour seals are further differentiated by 'hauled out' and 'in water'. This analysis updates previous estimates (Williams & Thomas 2007) by including additional years of effort, providing greater spatial precision with the DSM method over CDS, reporting for spring and autumn seasons (rather than summer alone), and adding new abundance estimates for Steller sea lion and northern elephant seal. In addition to providing a baseline of marine mammal abundance and distribution, against which future changes can be compared, this information offers the opportunity to assess the risks posed to marine mammals by existing and emerging threats, such as fisheries bycatch, ship strikes, and the increased oil spill and ocean noise risks associated with growth in container ship and oil tanker traffic in British Columbia's continental shelf waters.
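For reference, the textbook conventional distance sampling estimator for line transects (after Buckland et al.) has the form below; this is the standard formula, not necessarily the exact variant applied in these analyses:

$$\hat{D} = \frac{n \,\hat{f}(0)\,\hat{E}[s]}{2L}, \qquad \hat{N} = \hat{D}\,A,$$

where $n$ is the number of detected groups, $\hat{f}(0)$ the fitted detection probability density evaluated at zero perpendicular distance, $\hat{E}[s]$ the expected group size, $L$ the total transect length surveyed, and $A$ the stratum area.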

Starting with marine animal observations at specific coordinates and times, I combine these data with environmental data, often satellite-derived, to produce seascape predictions that generalize in space and time. These habitat-based models enable prediction of encounter rates and, in the case of density surface models, abundance, which can then be applied to management scenarios. Specific human activities, OWED and shipping, are then compared within a tradeoff decision support framework, enabling interchangeable map and tradeoff plot views. These products make complex processes transparent, allowing conservation, industry and stakeholder interests to game scenarios towards optimal marine spatial management, fundamental to the tenets of marine spatial planning, ecosystem-based management and dynamic ocean management.

Relevance: 60.00%

Abstract:

Atomic ions trapped in micro-fabricated surface traps can be utilized as a physical platform with which to build a quantum computer. They possess many of the desirable qualities of such a device, including high-fidelity state preparation and readout, universal logic gates, and long coherence times, and they can be readily entangled with each other through photonic interconnects. The use of optical cavities integrated with trapped-ion qubits as a photonic interface presents the possibility of order-of-magnitude improvements in performance in several key areas of their use in quantum computation. The first part of this thesis describes the design and fabrication of a novel surface trap for integration with an optical cavity. The trap is custom made on a highly reflective mirror surface and includes the capability of moving the ion trap location along all three trap axes with nanometer-scale precision. The second part of this thesis demonstrates the suitability of small micro-cavities, formed from laser-ablated fused silica substrates with radii of curvature in the 300-500 micron range, for use with the mirror trap as part of an integrated ion trap cavity system. Quantum computing applications for such a system include dramatic improvements in the photonic entanglement rate (up to 10 kHz), the qubit measurement time (down to 1 microsecond), and the measurement error rate (down to the 10⁻⁵ range). The final part of this thesis details a performance simulator for exploring the physical resource requirements and performance demands of scaling such a quantum computer to sizes capable of performing quantum algorithms beyond the limits of classical computation.

Relevance: 60.00%

Abstract:

This talk explores how the runtime system and operating system can leverage metrics that express the significance and resilience of application components in order to reduce the energy footprint of parallel applications. In particular, we will examine how software can tolerate, and indeed exploit, higher error rates in future processors and memory technologies that may operate outside their safe margins.

Relevance: 60.00%

Abstract:

Three types of forecasts of the total Australian production of macadamia nuts (t nut-in-shell) have been produced early each year since 2001. The first is a long-term forecast, based on the expected production from the tree census data held by the Australian Macadamia Society, suitably scaled up for missing data and assumed new plantings each year. These long-term forecasts range out to 10 years in the future and form a basis for industry and market planning. Secondly, a statistical adjustment (termed the climate-adjusted forecast) is made annually for the coming crop. As the name suggests, climatic influences are the dominant factors in this adjustment process; however, other terms such as bienniality of bearing, prices and orchard aging are also incorporated. Thirdly, industry personnel are surveyed early each year, with their estimates integrated into a growers' and pest-scouts' forecast. Initially conducted on a whole-country basis, these models are now constructed separately for the six main production regions of Australia, and combined for national totals. Ensembles or suites of step-forward regression models using biologically relevant variables have been the major statistical method adopted; however, developing methodologies such as nearest-neighbour techniques, general additive models and random forests are continually being evaluated in parallel. The overall error rates average 14% for the climate forecasts and 12% for the growers' forecasts. These compare with 7.8% for USDA almond forecasts (based on extensive early-crop sampling) and 6.8% for coconut forecasts in Sri Lanka. However, our somewhat disappointing results were mainly due to a series of poor crops attributable to human factors, which have now been factored into the models. Notably, the 2012 and 2013 forecasts averaged 7.8% and 4.9% errors, respectively. Future models should also show continuing improvement as more data-years become available.
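A minimal sketch of how such an overall error rate can be computed, assuming it is a mean absolute percentage error (MAPE) of annual forecasts against actual production; the figures below are made-up placeholders, not industry data.

```python
# MAPE of annual production forecasts (illustrative; placeholder figures).
def mape(actual, forecast):
    """Mean absolute percentage error, in percent."""
    return 100 * sum(abs(a - f) / a for a, f in zip(actual, forecast)) / len(actual)

actual   = [35_000, 41_000, 28_500, 44_200]  # t nut-in-shell (hypothetical)
forecast = [38_000, 39_500, 33_000, 43_000]  # climate-adjusted forecasts (hypothetical)
print(f"MAPE: {mape(actual, forecast):.1f}%")
```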

Relevance: 60.00%

Abstract:

Prevalent face recognition difficulties in Alzheimer's disease (AD) have typically been attributed to the underlying episodic and semantic memory impairment. The aim of the current study was to determine whether AD patients are also impaired at the perceptual level for faces, more specifically at extracting a visual representation of an individual face. To address this question, we investigated the matching of simultaneously presented individual faces and of other nonface familiar shapes (cars), at both upright and inverted orientations, in a group of mild AD patients and a group of healthy older controls matched for age and education. AD patients showed a reduced inversion effect (i.e., better performance for upright than for inverted stimuli) for faces, but not for cars, in terms of both error rates and response times. While healthy participants showed a much larger decrease in performance with inversion for faces than for cars, the inversion effect did not differ significantly between faces and cars in AD. This abnormal inversion effect for faces was observed in a large subset of individual patients with AD. These results suggest that AD patients have deficits in higher-level visual processes, more specifically in perceiving individual faces, a function that relies on holistic representations specific to upright face stimuli. These deficits, combined with their memory impairment, may contribute to the difficulties in recognizing familiar people that are often reported by patients suffering from the disease and by their caregivers.