942 results for Subpixel precision
Abstract:
The Intoxilyzer 5000 was tested for calibration curve linearity at ethanol vapor concentrations between 0.020 and 0.400 g/210 L, with excellent linearity. Calibration error using reference solutions outside the allowed concentration range, response to the same ethanol reference solution at temperatures between 34 and 38 °C, and response to eleven chemicals potentially found in human breath, to ten mixtures of two at a time, and to one mixture of four chemicals were evaluated. Potential interferents were chosen on the basis of their infrared signatures, with solution concentrations corresponding to the non-lethal blood concentration ranges of various volatile organic compounds reported in the literature. The results indicate that the instrument calibrates with solutions outside the allowed range, up to ±10% of the target value. Headspace dual-column GC analysis with FID detection was used to confirm the solution concentrations. Increasing the temperature of the reference solution from 34 to 38 °C produced linear increases in the instrument's recorded ethanol readings, averaging 6.25%/°C. Of the eleven chemicals studied, six (isopropanol, toluene, methyl ethyl ketone, trichloroethylene, acetaldehyde, and methanol) could reasonably interfere with the test at non-lethal reported blood concentration ranges. Mixtures of those six chemicals showed linearly additive results, with a combined effect of as much as a 0.080 g/210 L reading (Florida's legal limit) without any ethanol present.
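The reported temperature sensitivity implies a simple linear correction. A minimal sketch of that arithmetic, using the 6.25%/°C average slope the abstract reports (the function name and the 34 °C nominal reference are illustrative):

```python
def adjusted_reading(true_reading, solution_temp_c, reference_temp_c=34.0,
                     slope_per_deg=0.0625):
    """Estimate the instrument-recorded ethanol reading (g/210 L) when the
    reference solution is warmer than nominal, assuming the linear
    ~6.25%/degC increase reported in the study."""
    return true_reading * (1.0 + slope_per_deg * (solution_temp_c - reference_temp_c))

# A 0.080 g/210 L solution read at 38 degC instead of 34 degC:
reading = adjusted_reading(0.080, 38.0)
print(round(reading, 4))  # 0.1 -> a 25% overestimate over the 4 degC span
```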
Abstract:
Historically, memory has been evaluated by examining how much is remembered; a more recent conception of memory focuses instead on the accuracy of memories. Under this accuracy-oriented conception, unlike the quantity-oriented approach, memory does not always deteriorate over time. A possible explanation for this seemingly surprising finding lies in the metacognitive processes of monitoring and control. These processes allow people to withhold responses of which they are unsure, or to adjust the precision of responses to a level broad enough to be correct. The ability to accurately report memories has implications for investigators who interview witnesses to crimes and for those who evaluate witness testimony. This research examined the amount of information, accuracy, and precision of responses provided during immediate and delayed interviews about a videotaped mock crime. The interview format was manipulated such that either a single free narrative response was elicited or a series of yes/no or cued questions was asked. Instructions from the interviewer indicated that participants should stress either being informative or being accurate. The interviews were then transcribed and scored. Results indicate that accuracy rates remained stable and high after a one-week delay. Compared with those interviewed immediately, participants interviewed after a delay provided less information and less precise responses. Participants in the free narrative condition were the most accurate. Participants in the cued questions condition provided the most precise responses. Participants in the yes/no questions condition were most likely to say “I don’t know”. The results indicate that people are able to monitor their memories and modify their reports to maintain high accuracy. When control over precision was not possible, as in the yes/no condition, people said “I don’t know” to maintain accuracy. When both withholding responses and adjusting precision were possible, people used both methods. Concerns that memories reported after a long retention interval might be inaccurate thus appear to be unfounded.
Abstract:
The authors thank Professor Iberê Luiz Caldas for the suggestions and encouragement. The authors F.F.G.d.S., R.M.R., J.C.S., and H.A.A. acknowledge the Brazilian agency CNPq and state agencies FAPEMIG, FAPESP, and FAPESC, and M.S.B. also acknowledges the EPSRC Grant Ref. No. EP/I032606/1.
Abstract:
Precision medicine is an emerging approach to disease treatment and prevention that considers variability in patient genes, environment, and lifestyle. However, little has been written about how such research impacts emergency care. Recent advances in analytical techniques have made it possible to characterize patients in a more comprehensive and sophisticated fashion at the molecular level, promising highly individualized diagnosis and treatment. Among these techniques are various systematic molecular phenotyping analyses (e.g., genomics, transcriptomics, proteomics, and metabolomics). Although a number of emergency physicians use such techniques in their research, widespread discussion of these approaches has been lacking in the emergency care literature and many emergency physicians may be unfamiliar with them. In this article, we briefly review the underpinnings of such studies, note how they already impact acute care, discuss areas in which they might soon be applied, and identify challenges in translation to the emergency department (ED). While such techniques hold much promise, it is unclear whether the obstacles to translating their findings to the ED will be overcome in the near future. Such obstacles include validation, cost, turnaround time, user interface, decision support, standardization, and adoption by end-users.
Abstract:
Ignoring small-scale heterogeneities in Arctic land cover may bias estimates of water, heat and carbon fluxes in large-scale climate and ecosystem models. We investigated subpixel-scale heterogeneity in CHRIS/PROBA and Landsat-7 ETM+ satellite imagery over ice-wedge polygonal tundra in the Lena Delta of Siberia, and the associated implications for evapotranspiration (ET) estimation. Field measurements were combined with aerial and satellite data to link fine-scale (0.3 m resolution) with coarse-scale (up to 30 m resolution) land cover data. A large portion of the total wet tundra (80%) and water body area (30%) appeared in the form of patches less than 0.1 ha in size, which could not be resolved with satellite data. Wet tundra and small water bodies represented about half of the total ET in summer. Their contribution fell to 20% in fall, when ET rates from dry tundra were instead highest. Inclusion of subpixel-scale water bodies increased the total water surface area of the Lena Delta from 13% to 20%. The actual land/water proportions within each composite satellite pixel were best captured with Landsat data using a statistical downscaling approach, which is recommended for reliable large-scale modelling of water, heat and carbon exchange from permafrost landscapes.
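Statistical downscaling of the kind the authors recommend regresses a coarse-pixel spectral signal against fine-scale water fractions, then predicts subpixel fractions from the coarse data alone. A toy sketch of that idea (the single-band linear relation, the noise level, and all data are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
# Invented fine-scale (0.3 m) water fractions aggregated to 30 m pixels,
# plus a coarse-band reflectance that darkens as water cover increases.
true_frac = rng.uniform(0.0, 0.5, size=200)
reflectance = 0.4 - 0.5 * true_frac + rng.normal(0, 0.01, size=200)

# Fit the downscaling relation: subpixel water fraction ~ reflectance.
slope, intercept = np.polyfit(reflectance, true_frac, 1)
predicted = slope * reflectance + intercept
print(round(float(np.corrcoef(true_frac, predicted)[0, 1]), 2))
```

With real imagery the regression would be trained where fine-scale reference data exist and applied across the whole scene.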
Abstract:
In his last two State of the Union addresses, President Barack Obama has focused on the need to deliver innovative solutions to improve human health, through the Precision Medicine Initiative in 2015 and the recently announced Cancer Moonshot in 2016. Precision cancer care has delivered clear patient benefit, but even for high-impact medicines such as imatinib mesylate (Glivec) in chronic myeloid leukaemia, the excitement at the success of this practice-changing clinical intervention has been somewhat tempered by the escalating price of this 'poster child' for precision cancer medicine (PCM). Recent studies on the costs of cancer drugs have revealed significant price differentials, which are a major causative factor behind disparities in the access to new generations of immunological and molecularly targeted agents. In this perspective, we will discuss the benefits of PCM to modern cancer control, but also emphasise how increasing costs are rendering the current approaches to integrating the paradigm of PCM unsustainable. Despite the ever increasing pressure on cancer and health care budgets, innovation will and must continue. Value-based frameworks offer one of the most rational approaches for policymakers committed to improving cancer outcomes through a public health approach.
Abstract:
Temporal replicate counts are often aggregated to improve model fit by reducing zero-inflation and count variability, and in the case of migration counts collected hourly throughout a migration, aggregation allows one to ignore nonindependence. However, aggregation can represent a loss of potentially useful information on the hourly or seasonal distribution of counts, which might impact our ability to estimate reliable trends. We simulated 20-year hourly raptor migration count datasets with a known rate of change to test the effect of aggregating hourly counts to daily or annual totals on our ability to recover the known trend. We simulated data for three types of species, to test whether results varied with species abundance or migration strategy: a commonly detected species, e.g., Northern Harrier, Circus cyaneus; a rarely detected species, e.g., Peregrine Falcon, Falco peregrinus; and a species typically counted in large aggregations with overdispersed counts, e.g., Broad-winged Hawk, Buteo platypterus. We compared accuracy and precision of estimated trends across species and count types (hourly/daily/annual) using hierarchical models that assumed a Poisson, negative binomial (NB) or zero-inflated negative binomial (ZINB) count distribution. We found little benefit of modeling zero-inflation or of modeling the hourly distribution of migration counts. For the rare species, trends analyzed using daily totals and an NB or ZINB data distribution resulted in a higher probability of detecting an accurate and precise trend. In contrast, trends of the common and overdispersed species benefited from aggregation to annual totals, and for the overdispersed species in particular, trends estimated using annual totals were more precise, and resulted in lower probabilities of estimating a trend (1) in the wrong direction, or (2) with credible intervals that excluded the true trend, as compared with hourly and daily counts.
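The simulation design can be illustrated in miniature: generate hourly counts with a known trend, aggregate, and check whether the trend is recovered. A sketch under invented parameters (trend, mean count, and count-hours per year are illustrative; the authors' hierarchical Bayesian models are replaced here by a simple log-linear fit):

```python
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(20)
trend = -0.02          # known 2%/yr decline built into the simulation
lam0 = 3.0             # invented mean hourly count

# Simulate hourly Poisson counts (100 count-hours/yr), then aggregate
# to annual totals, as in the coarsest aggregation the study compares.
hourly = rng.poisson(lam0 * np.exp(trend * years)[:, None], size=(20, 100))
annual = hourly.sum(axis=1)

# Recover the trend from annual totals with a log-linear fit.
slope = np.polyfit(years, np.log(annual), 1)[0]
print(round(float(slope), 3))
```

The recovered slope should sit near the built-in -0.02; repeating this across count types and species profiles is the core of the comparison the abstract describes.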
Abstract:
The phase difference principle is widely applied nowadays in sonar systems used for sea floor bathymetry. The apparent angle of a target point is obtained from the phase difference measured between two closely spaced receiving arrays. Here we study the influence of phase difference estimation errors caused by the physical structure of the backscattered signals. It is shown that, under certain conditions, beyond the commonly considered effects of additive external noise and baseline decorrelation, the processing may be affected by the shifting-footprint effect: the two interferometer receivers get simultaneous echo contributions from slightly shifted parts of the seabed, which degrades the signal coherence and, hence, the phase difference measurement. This geometrical effect is described analytically and checked with numerical simulations, for both square- and sine-shaped signal envelopes. Its relative influence depends on the geometrical configuration and receiver spacing; it may be prevalent in practical cases associated with bathymetric sonars. Measurements close to nadir, which are known to be especially difficult with interferometric systems, are addressed in particular.
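The phase difference principle referred to here is the standard narrowband interferometry relation Δφ = (2πd/λ)·sin θ, inverted to recover the apparent angle. A minimal sketch (the sonar frequency, spacing, and wavelength are illustrative numbers, not values from the study):

```python
import math

def arrival_angle(phase_diff_rad, spacing_m, wavelength_m):
    """Apparent angle of a seabed target (rad from broadside) from the
    phase difference between two receivers, via
    delta_phi = 2*pi*d/lambda * sin(theta)."""
    return math.asin(phase_diff_rad * wavelength_m / (2 * math.pi * spacing_m))

# Illustrative ~100 kHz sonar (lambda ~ 15 mm in water), half-wavelength spacing:
theta = arrival_angle(math.pi / 2, spacing_m=0.0075, wavelength_m=0.015)
print(round(math.degrees(theta), 1))  # 30.0
```

Any error in the measured phase difference, from noise, baseline decorrelation, or the shifting-footprint effect, propagates directly through this inversion into the bathymetric angle estimate.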
Abstract:
Manipulation of single cells and particles is important to biology and nanotechnology. Our electrokinetic (EK) tweezers manipulate objects in simple microfluidic devices using gentle fluid and electric forces under vision-based feedback control. In this dissertation, I detail a user-friendly implementation of EK tweezers that allows users to select, position, and assemble cells and nanoparticles. This EK system was used to measure attachment forces between living breast cancer cells, trap single quantum dots with 45 nm accuracy, build nanophotonic circuits, and scan optical properties of nanowires. With a novel multi-layer microfluidic device, EK was also used to guide single microspheres along complex 3D trajectories. The schemes, software, and methods developed here can be used in many settings to precisely manipulate most visible objects, assemble objects into useful structures, and improve the function of lab-on-a-chip microfluidic systems.
Abstract:
We describe a new geometry for electrostatic actuators to be used in sensitive laser interferometers, suited for prototype and table-top experiments related to gravitational wave detection with mirrors of 100 g or less. The arrangement consists of two plates at the sides of the mirror (test mass), and therefore does not reduce its clear aperture as a conventional electrostatic drive (ESD) would. Using the sample case of the AEI-10 m prototype interferometer, we investigate the actuation range and the influence of the relative misalignment of the ESD plates with respect to the test mass. We find that in the case of the AEI-10 m prototype interferometer, this new kind of ESD could provide a range of 0.28 µm when operated at a voltage of 1 kV. In addition, the geometry presented is shown to provide a reduction factor of about 100 in the magnitude of the actuator motion coupling to the test mass displacement. We show that, therefore, in the specific case of the AEI-10 m interferometer, it is possible to mount the ESD actuators directly on the optical table without spoiling the seismic isolation performance of the triple-stage suspension of the main test masses.
Abstract:
A basic requirement of a plasma etching process is fidelity of the patterned organic materials. In photolithography, a He plasma pretreatment (PPT) based on high ultraviolet and vacuum ultraviolet (UV/VUV) exposure was shown to be successful for roughness reduction of 193 nm photoresist (PR). Typical multilayer masks consist of many other organic masking materials in addition to 193 nm PR. These materials vary significantly in UV/VUV sensitivity and therefore respond differently to the He PPT. A delamination of the nanometer-thin, ion-induced dense amorphous carbon (DAC) layer was observed: extensive He PPT exposure produces volatile species through UV/VUV-induced scissioning, and these species are trapped underneath the DAC layer in a subsequent plasma etch (PE), causing a loss of adhesion. Beyond stabilizing organic materials, the major goals of this work were to establish and evaluate a cyclic fluorocarbon (FC) based approach for atomic layer etching (ALE) of SiO2 and Si, to characterize the mechanisms involved, and to evaluate the impact of processing parameters. Periodic, short precursor injections allow precise deposition of thin FC films. These films limit the amount of chemical etchant available during subsequent low-energy, plasma-based Ar+ ion bombardment, resulting in strongly time-dependent etch rates. In situ ellipsometry showcased the self-limited etching. X-ray photoelectron spectroscopy (XPS) confirmed FC film deposition and mixing with the substrate. The cyclic ALE approach can also precisely etch Si substrates. A weaker time dependence of the etching is seen for Si, likely because of a lower physical sputtering energy threshold. A fluorinated, oxidized surface layer is present during ALE of Si and greatly influences the etch behavior. A reaction of the precursor with the fluorinated substrate upon injection was observed and characterized. The cyclic ALE approach was transferred to a manufacturing-scale reactor at IBM Research; ensuring transferability to industrial device patterning is crucial for the application of ALE. In addition to device patterning, the cyclic ALE process was employed for oxide removal from Si and SiGe surfaces, with the goal of minimal substrate damage and surface residues. The ALE process developed for SiO2 and Si etching did not remove native oxide at the level required, but optimizing the process enabled strong O removal from the surface. A subsequent 90% H2/Ar plasma allows removal of C and F residues.
Abstract:
My study investigated internal consistency estimates from psychometric surveys as an operationalization of the state of measurement precision of constructs in industrial and organizational (I/O) psychology. Analyses were conducted on samples used in research articles published in the Journal of Applied Psychology between 1975 and 2010 at five-year intervals (K = 934 samples from 480 articles, yielding 1427 coefficients). Articles and their respective samples were coded for test-taker characteristics (e.g., age, gender, and ethnicity), research settings (e.g., lab and field studies), and actual tests (e.g., number of items and scale anchor points). A depository of reliabilities and inter-item correlations was developed for I/O variables and construct groups. Personality measures had significantly lower inter-item correlations than other construct groups. Internal consistency estimates and reporting practices were also evaluated over time, demonstrating improvements in measurement precision and in the reporting of missing data.
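The internal consistency estimates pooled in such a study are typically Cronbach's alpha, computed from item variances and the variance of the total score. A minimal sketch (the survey data are invented for illustration):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

# Invented 5-item survey scores for 6 respondents (1-5 Likert scale):
scores = [[4, 5, 4, 4, 5],
          [2, 2, 3, 2, 2],
          [5, 5, 5, 4, 5],
          [3, 3, 2, 3, 3],
          [4, 4, 4, 5, 4],
          [1, 2, 1, 2, 1]]
print(round(cronbach_alpha(scores), 2))
```

High inter-item correlations, as in this toy matrix, drive alpha toward 1; the lower inter-item correlations found for personality measures would yield correspondingly lower alphas at a given scale length.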