846 results for Tracking errors


Relevance:

20.00%

Publisher:

Abstract:

Using a weighted up-down procedure, 28 participants in each of eight conditions compared the durations of auditory (noise bursts) or visual (LED flashes) intervals, either filled or empty (delimited by 3-ms markers), with or without feedback. Standards (Sts) were 100 and 1000 ms, and the ISI was 900 ms. Presentation orders St-Comparison (Co) and Co-St were intermixed. Time-order errors (TOEs) were positive for St = 100 ms and negative for St = 1000 ms. Weber fractions (WFs, JND/St) were lowered by feedback. For visual-filled and visual-empty intervals, WFs were highest for St = 100 ms. For auditory-filled and visual-empty intervals, St interacted with order: the lowest WFs occurred for St-Co with St = 1000 ms, but for Co-St with St = 100 ms. The lowest average WFs occurred with St-Co for visual-filled, but with Co-St for visual-empty intervals. The results refute the generalization of better discrimination with St-Co than with Co-St (the "type-B effect") and support the notion of sensation weighting: flexibly differential impact weights of the compared durations in generating the response.
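
The Weber fraction is defined in the abstract as the JND divided by the standard duration. As a minimal illustration only (not the authors' analysis code), the sketch below estimates a JND from the reversal points of a simple up-down staircase and converts it to a Weber fraction; the threshold-from-reversals rule and the example numbers are assumptions.

```python
import numpy as np

def jnd_from_reversals(comparison_levels, reversal_indices, standard_ms):
    """Estimate a JND as the mean absolute difference between the comparison
    durations at staircase reversals and the standard. This is a generic
    staircase summary, not the weighted up-down rule used in the study."""
    reversals = np.asarray(comparison_levels)[reversal_indices]
    return np.mean(np.abs(reversals - standard_ms))

# Hypothetical staircase track for a 100-ms standard.
levels = [130, 120, 110, 105, 112, 106, 101, 108, 103]   # comparison durations (ms)
reversal_idx = [3, 4, 6, 7]                               # trials where the track reversed
standard = 100.0                                          # ms

jnd = jnd_from_reversals(levels, reversal_idx, standard)
weber_fraction = jnd / standard                           # WF = JND / St, as in the abstract
print(f"JND = {jnd:.1f} ms, Weber fraction = {weber_fraction:.3f}")
```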

Relevance:

20.00%

Publisher:

Abstract:

In the event of a termination of the Gravity Recovery and Climate Experiment (GRACE) mission before the launch of GRACE Follow-On (due for launch in 2017), high-low satellite-to-satellite tracking (hl-SST) will be the only dedicated observing system with global coverage available to measure the time-variable gravity field (TVG) on a monthly or even shorter time scale. Until recently, hl-SST TVG observations were of poor quality and hardly improved on the performance of Satellite Laser Ranging observations. To date, they have been of only very limited usefulness to geophysical or environmental investigations. In this paper, we apply a thorough reprocessing strategy and a dedicated Kalman filter to Challenging Minisatellite Payload (CHAMP) data to demonstrate that it is possible to derive the very long-wavelength TVG features down to spatial scales of approximately 2000 km at the annual frequency and for multi-year trends. The results are validated against GRACE data and surface height changes from long-term GPS ground stations in Greenland. We find that the quality of the CHAMP solutions is sufficient to derive long-term trends and annual amplitudes of mass change over Greenland. We conclude that hl-SST is a viable source of information for TVG and can serve to some extent to bridge a possible gap between the end of life of GRACE and the availability of GRACE Follow-On.
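
The abstract mentions a dedicated Kalman filter for recovering time-variable gravity from hl-SST data. As a rough sketch of the general idea only (the random-walk state model, the state dimension, and the observation matrix below are invented placeholders, not the paper's filter), a linear Kalman predict/update step for a small set of gravity field coefficients could look like this:

```python
import numpy as np

def kalman_step(x, P, z, H, R, Q):
    """One predict/update cycle of a linear Kalman filter.
    x : state estimate (e.g., a few low-degree gravity field coefficients)
    P : state covariance
    z : observations for the current epoch
    H : observation matrix mapping state to observations
    R : observation noise covariance
    Q : process noise covariance (allows slow temporal variation)"""
    # Predict step: random-walk state model (placeholder for the paper's dynamics).
    x_pred, P_pred = x, P + Q
    # Update step with the new observations.
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Toy example with 3 state parameters and 5 observations per epoch.
rng = np.random.default_rng(0)
x, P = np.zeros(3), np.eye(3)
H = rng.normal(size=(5, 3))
R, Q = 0.01 * np.eye(5), 1e-4 * np.eye(3)
z = H @ np.array([1.0, -0.5, 0.2]) + 0.1 * rng.normal(size=5)
x, P = kalman_step(x, P, z, H, R, Q)
print(np.round(x, 2))
```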

Relevance:

20.00%

Publisher:

Abstract:

The COSMIC-2 mission is a follow-on to the Constellation Observing System for Meteorology, Ionosphere, and Climate (COSMIC) with an upgraded payload for improved radio occultation (RO) applications. The objective of this paper is to develop a near-real-time (NRT) orbit determination system, called the NRT National Chiao Tung University (NCTU) system, to support COSMIC-2 in atmospheric applications and to verify the orbit product of COSMIC. The system is capable of automatic determination of the NRT GPS clocks and of the LEO orbit and clock. To assess the NRT (NCTU) system, we use eight days of COSMIC data (March 24-31, 2011), containing a total of 331 GPS observation sessions and 12,393 RO observable files. Parallel scheduling of the independent GPS and LEO estimations, together with automatic time matching, improves computational efficiency by 64% compared to sequential scheduling. Orbit difference analyses suggest a 10-cm accuracy for the COSMIC orbits from the NRT (NCTU) system, consistent with the NRT University Corporation for Atmospheric Research (UCAR) system. The mean velocity accuracy from the NRT orbits of COSMIC is 0.168 mm/s, corresponding to an error of about 0.051 μrad in the bending angle. The rms differences in the NRT COSMIC clocks and in the GPS clocks between the NRT (NCTU) and the post-processing products are 3.742 and 1.427 ns, respectively. The GPS clocks determined from a partial ground GPS network [NRT (NCTU)] and a full one [NRT (UCAR)] result in mean rms frequency stabilities of 6.1E-12 and 2.7E-12, respectively, corresponding to range fluctuations of 5.5 and 2.4 cm and bending angle errors of 3.75 and 1.66 μrad.
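
The orbit accuracy quoted above is expressed as an RMS difference between two orbit products. A generic way to form such a statistic (the epochs, sampling, and magnitudes below are hypothetical, not the NRT (NCTU) or NRT (UCAR) products) is sketched here:

```python
import numpy as np

def orbit_rms_difference(orbit_a, orbit_b):
    """RMS of 3D position differences between two orbit solutions.
    orbit_a, orbit_b : arrays of shape (n_epochs, 3), positions in metres,
    assumed to be given at identical epochs and in the same reference frame."""
    d = np.linalg.norm(orbit_a - orbit_b, axis=1)
    return np.sqrt(np.mean(d ** 2))

# Hypothetical one-minute samples of two LEO orbit solutions over one day.
rng = np.random.default_rng(1)
ref = rng.normal(scale=7e6, size=(1440, 3))               # placeholder positions
test = ref + rng.normal(scale=0.10, size=ref.shape)       # decimetre-level differences
print(f"RMS orbit difference: {orbit_rms_difference(ref, test):.3f} m")
```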

Relevance:

20.00%

Publisher:

Abstract:

Upper-air observations are a fundamental data source for global atmospheric data products, but uncertainties, particularly in the early years, are not well known. Most of the early observations, which have now been digitized, are prone to a large variety of undocumented uncertainties (errors) that need to be quantified, e.g., for their assimilation in reanalysis projects. We apply a novel approach to estimate errors in upper-air temperature, geopotential height, and wind observations from the Comprehensive Historical Upper-Air Network for the time period from 1923 to 1966. We distinguish between random errors, biases, and a term that quantifies the representativity of the observations. The method is based on a comparison of neighboring observations and is hence independent of metadata, making it applicable to a wide range of observational data sets. The estimated mean random errors for all observations within the study period are 1.5 K for air temperature, 1.3 hPa for pressure, 3.0 m s−1 for wind speed, and 21.4° for wind direction. The estimates are compared to results of previous studies and analyzed with respect to their spatial and temporal variability.
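
The neighbor-comparison idea rests on a simple variance decomposition: if two nearby stations observe the same quantity with independent random errors of similar size, the variance of their difference is roughly twice the single-observation error variance plus a representativity term. The sketch below illustrates that simplified decomposition; it omits the study's treatment of biases and distance dependence, and the numbers are invented.

```python
import numpy as np

def random_error_from_pairs(obs_a, obs_b, representativity_var=0.0):
    """Estimate the single-observation random error from paired neighbouring
    observations, assuming independent errors of equal variance:
    var(a - b) = 2 * var_error + var_representativity  (simplified model)."""
    diff = np.asarray(obs_a) - np.asarray(obs_b)
    var_error = max((np.var(diff, ddof=1) - representativity_var) / 2.0, 0.0)
    return np.sqrt(var_error)

# Hypothetical collocated temperature observations (K) from two stations.
rng = np.random.default_rng(2)
truth = 250 + 5 * rng.standard_normal(500)
a = truth + 1.5 * rng.standard_normal(500)
b = truth + 1.5 * rng.standard_normal(500)
print(f"Estimated random error: {random_error_from_pairs(a, b):.2f} K")
```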

Relevance:

20.00%

Publisher:

Abstract:

Reliable detection of JAK2-V617F is critical for accurate diagnosis of myeloproliferative neoplasms (MPNs); in addition, sensitive mutation-specific assays can be applied to monitor disease response. However, there has been no consistent approach to JAK2-V617F detection, with assays varying markedly in performance, affecting clinical utility. Therefore, we established a network of 12 laboratories from seven countries to systematically evaluate nine different DNA-based quantitative PCR (qPCR) assays, including those in widespread clinical use. Seven quality control rounds involving over 21,500 qPCR reactions were undertaken using centrally distributed cell line dilutions and plasmid controls. The two best-performing assays were tested on normal blood samples (n=100) to evaluate assay specificity, followed by analysis of serial samples from 28 patients transplanted for JAK2-V617F-positive disease. The most sensitive assay, which performed consistently across a range of qPCR platforms, predicted outcome following transplant, with the mutant allele detected a median of 22 weeks (range 6-85 weeks) before relapse. Four of seven patients achieved molecular remission following donor lymphocyte infusion, indicative of a graft vs MPN effect. This study has established a robust, reliable assay for sensitive JAK2-V617F detection, suitable for assessing response in clinical trials, predicting outcome and guiding management of patients undergoing allogeneic transplant.
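
Quantitative PCR assays of this kind typically report a mutant allele burden derived from a standard curve built from plasmid or cell-line dilutions. The following is a generic illustration of standard-curve quantification only; the slope, intercept, and Ct values are invented and this is not one of the nine assays evaluated in the study.

```python
def copies_from_ct(ct, slope, intercept):
    """Convert a qPCR Ct value to an estimated copy number using a
    standard curve fitted as Ct = slope * log10(copies) + intercept."""
    return 10 ** ((ct - intercept) / slope)

# Hypothetical standard curve from plasmid dilutions.
slope, intercept = -3.4, 38.0

ct_mutant, ct_total = 31.2, 24.8        # invented Ct values for one sample
mutant = copies_from_ct(ct_mutant, slope, intercept)
total = copies_from_ct(ct_total, slope, intercept)
print(f"Estimated JAK2-V617F allele burden: {100 * mutant / total:.2f}%")
```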

Relevance:

20.00%

Publisher:

Abstract:

Tourette Syndrome begins in childhood and is characterized by uncontrollable repetitive actions like neck craning or hopping and noises such as sniffing or chirping. Worst in early adolescence, these tics wax and wane in severity and occur in bouts unpredictably, often drawing unwanted attention from bystanders. Making matters worse, over half of children with Tourette Syndrome also suffer from comorbid, or concurrent, disorders such as attention deficit hyperactivity disorder (ADHD) and obsessive-compulsive disorder (OCD). These disorders introduce anxious thoughts, impulsivity, inattention, and mood variability that further disrupt children with Tourette Syndrome from focusing and performing well at school and home. Thus, deficits in the cognitive control functions of response inhibition, response generation, and working memory have long been ascribed to Tourette Syndrome. Yet, without considering the effects of medication, age, and comorbidity, this attribution is premature. This study used an infrared eye-tracking camera and various computer tasks requiring eye movement responses to evaluate response inhibition, response generation, and working memory in Tourette Syndrome. The study, the first to control for medication, age, and comorbidity, enrolled 39 unmedicated children with Tourette Syndrome and 29 typically developing peers aged 10-16 years, who completed reflexive and voluntary eye movement tasks and diagnostic rating scales assessing symptom severities of Tourette Syndrome, ADHD, and OCD. Children with Tourette Syndrome and comorbid ADHD and/or OCD, but not children with Tourette Syndrome only, took longer to respond and made more errors and distracted eye movements compared to typically developing children, displaying cognitive control deficits. However, increasing symptom severities of Tourette Syndrome, ADHD, and OCD correlated with one another. Thus, cognitive control deficits were not specific to Tourette Syndrome patients with comorbid conditions, but rather increased with increasing tic severity, suggesting that a majority of Tourette Syndrome patients, regardless of a clinical diagnosis of ADHD and/or OCD, have some level of cognitive control deficit. Therefore, clinicians should evaluate and counsel all families of children with Tourette Syndrome, with or without currently diagnosed ADHD and/or OCD, about the functional ramifications of comorbid symptoms and the fact that these symptoms may wax and wane with tic severity.

Relevance:

20.00%

Publisher:

Abstract:

One critical step in addressing and resolving the problems associated with human errors is the development of a cognitive taxonomy of such errors. In the case of errors, such a taxonomy may be developed (1) to categorize all types of errors along cognitive dimensions, (2) to associate each type of error with a specific underlying cognitive mechanism, (3) to explain why, and even predict when and where, a specific error will occur, and (4) to generate intervention strategies for each type of error. Based on Reason's (1992) definition of human errors and Norman's (1986) cognitive theory of human action, we have developed a preliminary action-based cognitive taxonomy of errors that largely satisfies these four criteria in the domain of medicine. We discuss initial steps for applying this taxonomy to develop an online medical error reporting system that not only categorizes errors but also identifies problems and generates solutions.
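
The taxonomy described above associates each error type with a stage of Norman's action cycle and is meant to feed an error reporting system. A very small sketch of how such a mapping could be represented in code follows; the stage names follow Norman's seven-stage model, while the example error categories, intervention strings, and the trivial classifier are purely illustrative assumptions.

```python
from dataclasses import dataclass
from enum import Enum, auto

class ActionStage(Enum):
    """Stages of Norman's (1986) action cycle."""
    GOAL = auto()
    INTENTION = auto()
    ACTION_SPECIFICATION = auto()
    EXECUTION = auto()
    PERCEPTION = auto()
    INTERPRETATION = auto()
    EVALUATION = auto()

@dataclass
class ErrorCategory:
    name: str
    stage: ActionStage          # underlying cognitive mechanism
    intervention: str           # suggested strategy for this error type

# Illustrative entries; a real taxonomy would be far more detailed.
TAXONOMY = [
    ErrorCategory("slip (wrong button pressed)", ActionStage.EXECUTION,
                  "improve interface layout and add confirmation steps"),
    ErrorCategory("misinterpretation of a lab value", ActionStage.INTERPRETATION,
                  "display reference ranges alongside results"),
]

def classify(report_text: str) -> ErrorCategory:
    """Placeholder classifier for an online error reporting system."""
    return TAXONOMY[0] if "pressed" in report_text else TAXONOMY[1]

print(classify("nurse pressed the wrong button").stage.name)
```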

Relevance:

20.00%

Publisher:

Abstract:

In many field or laboratory situations, well-mixed reservoirs such as injection or detection wells and gas distribution or sampling chambers define the boundaries of transport domains. Exchange of solutes or gases across such boundaries can occur through advective or diffusive processes. First, we analyzed situations where the inlet region consists of a well-mixed reservoir in a systematic way by interpreting them in terms of injection type. Second, we discussed the mass balance errors that seem to appear in the case of resident injections. Mixing cells (MC) can be coupled mathematically in different ways to a domain where advective-dispersive transport occurs: by assuming a continuous solute flux at the interface (flux injection, MC-FI), or by assuming a continuous resident concentration (resident injection). In the latter case, the flux leaving the mixing cell can be defined in two ways: either as the value when the interface is approached from the mixing-cell side (MC-RI-), or as the value when it is approached from the column side (MC-RI+). Solutions of these injection types with constant or, in one case, distance-dependent transport parameters were compared to each other as well as to a solution for a two-layer system, where the first layer was characterized by a large dispersion coefficient. These solutions differ mainly at small Peclet numbers. For most real situations, the model for resident injection MC-RI+ is considered to be relevant. This type of injection was modeled with a constant or with an exponentially varying dispersion coefficient within the porous medium. A constant dispersion coefficient will be appropriate for gases because of the Eulerian nature of the usually dominant gaseous diffusion coefficient, whereas an asymptotically growing dispersion coefficient will be more appropriate for solutes because of the Lagrangian nature of mechanical dispersion, which evolves only with the fluid flow. Assuming a continuous resident concentration at the interface between a mixing cell and a column, as in the MC-RI+ model, entails a flux discontinuity. This flux discontinuity arises inherently from the definition of a mixing cell: the mixing process is included in the balance equation, but does not appear in the description of the flux through the mixing cell. There, only convection appears, because the concentration within the mixing cell is homogeneous. Thus, the solute flux through a mixing cell in close contact with a transport domain is generally underestimated. This leads to (apparent) mass balance errors, which are often reported for similar situations and erroneously used to judge the validity of such models. Finally, the mixing cell model MC-RI+ defines a universal basis regarding the type of solute injection at a boundary. Depending on the mixing cell parameters, it represents, in its limits, flux as well as resident injections.
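
In equation form, the coupling described above amounts to a mass balance for the well-mixed reservoir joined to the one-dimensional advection-dispersion equation in the column. The following is a minimal sketch under simplifying assumptions (constant parameters, semi-infinite column); the notation is generic and not taken from the paper.

```latex
% Mixing cell (volume V, flow rate Q, concentration C_mc) feeding a semi-infinite
% column governed by the 1-D advection-dispersion equation (constant D and v):
\begin{align*}
  V \,\frac{\mathrm{d}C_{mc}}{\mathrm{d}t} &= Q\,\bigl(C_{in} - C_{mc}\bigr), \\
  \frac{\partial C}{\partial t} &= D\,\frac{\partial^{2} C}{\partial x^{2}}
      - v\,\frac{\partial C}{\partial x}, \qquad x > 0 .
\end{align*}
% Flux injection (MC-FI): continuity of solute flux at the interface x = 0:
%   v\,C_{mc}(t) = v\,C(0,t) - D\,\left.\frac{\partial C}{\partial x}\right|_{x=0}
% Resident injection (MC-RI): continuity of resident concentration:
%   C(0,t) = C_{mc}(t)
```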

Relevance:

20.00%

Publisher:

Abstract:

The variability of results from different automated methods of detection and tracking of extratropical cyclones is assessed in order to identify uncertainties related to the choice of method. Fifteen international teams applied their own algorithms to the same dataset - the period 1989-2009 of the European Centre for Medium-Range Weather Forecasts (ECMWF) interim reanalysis (ERA-Interim) data. This experiment is part of the community project Intercomparison of Mid Latitude Storm Diagnostics (IMILAST; see www.proclim.ch/imilast/index.html). The spread of results for cyclone frequency, intensity, life cycle, and track location is presented to illustrate the impact of using different methods. Globally, methods agree well on the geographical distribution in large oceanic regions, the interannual variability of cyclone numbers, the geographical patterns of strong trends, and the distribution shape of many life cycle characteristics. In contrast, the largest disparities exist for the total numbers of cyclones, the detection of weak cyclones, and the distribution in some densely populated regions. Consistency between methods is better for strong cyclones than for shallow ones. Two case studies of relatively large, intense cyclones reveal that the identification of the most intense part of the life cycle of these events is robust between methods, but considerable differences exist during the development and dissolution phases.

Relevance:

20.00%

Publisher:

Abstract:

For Northern Hemisphere extratropical cyclone activity, the dependency of a potential anthropogenic climate change signal on the identification method applied is analysed. This study investigates the impact of the algorithm used on the change signal, not the robustness of the climate change signal itself. Using a single transient AOGCM simulation as standard input for eleven state-of-the-art identification methods, the patterns of the model-simulated present-day climatologies are found to be close to those computed from re-analysis, independent of the method applied. Although differences exist in the total number of cyclones identified, the climate change signals (IPCC SRES A1B) in the model run considered are largely similar between methods for all cyclones. Taking all tracks into account, decreasing numbers are found in the Mediterranean, the Arctic (in the Barents and Greenland Seas), the mid-latitude Pacific, and North America. The change patterns are even more similar if only the most severe systems are considered: the methods reveal a coherent, statistically significant increase in frequency over the eastern North Atlantic and North Pacific. We find that the differences between the methods considered are largely due to the different role of weaker systems in the specific methods.

Relevance:

20.00%

Publisher:

Abstract:

The hippocampus receives input from upper levels of the association cortex and is implicated in many mnemonic processes, but the exact mechanisms by which it codes and stores information remain unresolved. This work examines the flow of information through the hippocampal formation while attempting to determine the computations that each of the hippocampal subfields performs in learning and memory. The formation, storage, and recall of hippocampal-dependent memories theoretically utilize an autoassociative attractor network that functions by implementing two competitive, yet complementary, processes. Pattern separation, hypothesized to occur in the dentate gyrus (DG), refers to the ability to decrease the similarity among incoming information by producing output patterns that overlap less than the inputs. In contrast, pattern completion, hypothesized to occur in the CA3 region, refers to the ability to reproduce a previously stored output pattern from a partial or degraded input pattern. Prior to addressing the functional role of the DG and CA3 subfields, the spatial firing properties of neurons in the dentate gyrus were examined. The principal cell of the dentate gyrus, the granule cell, has spatially selective place fields; however, the behavioral correlates of another excitatory cell, the mossy cell of the dentate polymorphic layer, are unknown. This report shows that putative mossy cells have spatially selective firing that consists of multiple fields, similar to previously reported properties of granule cells. Other cells recorded from the DG had single place fields. Compared to cells with multiple fields, cells with single fields fired at a lower rate during sleep, were less likely to burst, and were more likely to be recorded simultaneously with a large population of neurons that were active during sleep and silent during behavior. These data suggest that single-field and multiple-field cells constitute at least two distinct cell classes in the DG. Based on these characteristics, we propose that putative mossy cells tend to fire in multiple, distinct locations in an environment, whereas putative granule cells tend to fire in single locations, similar to place fields of the CA1 and CA3 regions. Experimental evidence supporting the theories of pattern separation and pattern completion comes from both behavioral and electrophysiological tests. These studies specifically focused on the function of each subregion and made implicit assumptions about how environmental manipulations changed the representations encoded by the hippocampal inputs. However, the cell populations that provided these inputs were in most cases not directly examined. We conducted a series of studies to investigate the neural activity in the entorhinal cortex, dentate gyrus, and CA3 under the same experimental conditions, which allowed a direct comparison between the input and output representations. The results show that the dentate gyrus representation changes between the familiar and cue-altered environments more than its input representations, whereas the CA3 representation changes less than its input representations. These findings are consistent with longstanding computational models proposing that (1) CA3 is an associative memory system performing pattern completion in order to recall previous memories from partial inputs, and (2) the dentate gyrus performs pattern separation to help store different memories in ways that reduce interference when the memories are subsequently recalled.
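
Comparisons of this kind are often quantified by population-vector similarity between two conditions at the input and the output of a region: if the output vectors are less correlated across conditions than the input vectors, the transformation separates; if more correlated, it completes. The schematic sketch below illustrates that comparison only; the firing-rate arrays, sparsity levels, and region labels are invented, not recordings from this study.

```python
import numpy as np

def population_correlation(rates_env1, rates_env2):
    """Pearson correlation between population firing-rate vectors
    recorded in two environments (one rate per cell)."""
    return np.corrcoef(rates_env1, rates_env2)[0, 1]

rng = np.random.default_rng(3)
n_cells = 200

# Invented input (entorhinal) rates: the two environments overlap strongly.
ec_env1 = rng.gamma(2.0, 1.0, n_cells)
ec_env2 = 0.8 * ec_env1 + 0.2 * rng.gamma(2.0, 1.0, n_cells)

# Invented dentate gyrus output: sparser and more decorrelated (separation).
dg_env1 = np.where(rng.random(n_cells) < 0.05, ec_env1, 0.0)
dg_env2 = np.where(rng.random(n_cells) < 0.05, ec_env2, 0.0)

print("input similarity :", round(population_correlation(ec_env1, ec_env2), 2))
print("output similarity:", round(population_correlation(dg_env1, dg_env2), 2))
```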

Relevance:

20.00%

Publisher:

Abstract:

Increasing amounts of clinical research data are collected by manual data entry into electronic source systems and directly from research subjects. For such manually entered source data, common data-cleaning methods such as post-entry identification and resolution of discrepancies and double data entry are not feasible. However, the data accuracy achieved without these mechanisms may fall short of what a particular research use requires. We evaluated a heuristic usability method for its utility as a tool to independently and prospectively identify data collection form questions associated with data errors. The method had a promising sensitivity of 64% and a specificity of 67%. It was used as described in the usability literature, with no further adaptation or specialization for predicting data errors. We conclude that usability evaluation methodology should be further investigated for use in data quality assurance.
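
Sensitivity and specificity figures of this kind come from a simple 2x2 comparison between the questions the usability method flags and the questions that actually produced data errors. A tiny worked example follows; the counts are made up and chosen only so that the formulas and the quoted percentages are visible.

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts: of 25 questions with errors, 16 were flagged;
# of 30 error-free questions, 20 were not flagged.
sens, spec = sensitivity_specificity(tp=16, fn=9, tn=20, fp=10)
print(f"sensitivity = {sens:.0%}, specificity = {spec:.0%}")
```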