987 results for computer reliability


Relevance: 20.00%

Publisher:

Abstract:

Ultra Wide Band (UWB) transmission has recently attracted considerable attention in the field of next-generation location-aware wireless sensor networks, owing to its fine time resolution, energy efficiency, and robustness to interference in harsh environments. This paper presents a thorough applied examination of prototype IEEE 802.15.4a impulse UWB transceiver technology to quantify the effect of line-of-sight (LOS) and non-line-of-sight (NLOS) conditions on ranging in real indoor and outdoor environments. The results draw on an extensive array of experiments that fully characterize, for the first time, the 802.15.4a UWB transceiver technology, its reliability, and its ranging capabilities. A new two-way (TW) ranging protocol is proposed. The goal of this work is to validate the technology as a dependable wireless communications mechanism for the subset of sensor network localization applications where reliability and positioning precision are key concerns.
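The two-way ranging idea above reduces to a time-of-flight calculation; a minimal sketch under idealized assumptions (the function name, timings, and turnaround delay below are invented for illustration and are not the paper's protocol):

```python
# Hedged sketch of two-way (TW) time-of-flight ranging, the kind of
# exchange used by IEEE 802.15.4a UWB transceivers. All values are
# illustrative; clock drift and antenna delays are ignored.

C = 299_792_458.0  # speed of light, m/s

def tw_range(t_round, t_reply):
    """Estimate distance from one two-way exchange.

    t_round: time at the initiator between sending the poll and
             receiving the response (seconds).
    t_reply: known turnaround delay at the responder (seconds).
    The one-way time of flight is half the round trip minus the
    responder's processing delay.
    """
    tof = (t_round - t_reply) / 2.0
    return C * tof

# Example: ~66.7 ns of extra round-trip time over a ~10 m path
d = tw_range(t_round=66.713e-9 + 1e-6, t_reply=1e-6)
```

Because the responder's delay is measured locally and subtracted, the two clocks never need to be synchronized, which is the main appeal of TW ranging over one-way schemes.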

Relevance: 20.00%

Publisher:

Abstract:

Buried heat sources can be investigated by examining thermal infrared images and comparing these with the results of theoretical models which predict the thermal anomaly a given heat source may generate. Key factors influencing surface temperature include the geometry and temperature of the heat source, the surface meteorological environment, and the thermal conductivity and anisotropy of the rock. In general, a geothermal heat flux of greater than 2% of solar insolation is required to produce a detectable thermal anomaly in a thermal infrared image. A heat source of, for example, 200-300 K above the average surface temperature must be at a depth shallower than 50 m for the anomaly to be detectable in a thermal infrared image, for typical terrestrial conditions. Atmospheric factors are of critical importance: while the mean atmospheric temperature has little significance, convection is a dominant factor and can act to swamp the thermal signature entirely. Given a steady-state heat source that produces a detectable thermal anomaly, it is possible to loosely constrain the physical properties of the heat source and surrounding rock, using the surface thermal anomaly as a basis. The success of this technique is highly dependent on the degree to which the physical properties of the host rock are known; important parameters include the surface thermal properties and thermal conductivity of the rock. Modelling of transient thermal situations was carried out to assess the effect of time-dependent thermal fluxes. One-dimensional finite element models can be readily and accurately applied to the investigation of diurnal heat flow, as with thermal inertia models. Diurnal thermal models of environments on Earth, the Moon, and Mars were constructed using finite elements and found to be consistent with published measurements. The heat flow from an injection of hot lava into a near-surface lava tube was considered.
While this approach is useful for study and for long-term monitoring of inhospitable areas, it was found to have little hazard-warning utility, as the time taken for the thermal energy to propagate to the surface in dry rock (several months) is very long. The resolution of the thermal infrared imaging system is an important factor. Presently available satellite-based systems such as Landsat (resolution of 120 m) are inadequate for detailed study of geothermal anomalies. Airborne systems, such as TIMS (variable resolution of 3-6 m), are much more useful for discriminating small buried heat sources. Planned improvements in the resolution of satellite-based systems will broaden the potential for application of the techniques developed in this thesis. It is important to note, however, that adequate spatial resolution is a necessary but not sufficient condition for successful application of these techniques.
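The one-dimensional transient modelling described above can be illustrated with a simpler explicit finite-difference scheme (the thesis uses finite elements; the grid spacing, diffusivity, and temperatures below are invented for illustration):

```python
# Hedged sketch: explicit finite-difference solution of the 1D heat
# equation dT/dt = alpha * d2T/dx2, a stand-in for the finite element
# models described above. A buried hot layer relaxes toward cooler,
# fixed-temperature boundaries. All parameter values are illustrative.

def diffuse_1d(T, alpha, dx, dt, steps):
    """Advance a 1D temperature profile; end nodes held fixed (Dirichlet)."""
    # Stability of the explicit scheme requires alpha*dt/dx**2 <= 0.5.
    r = alpha * dt / dx**2
    assert r <= 0.5, "explicit scheme unstable for this step size"
    T = list(T)
    for _ in range(steps):
        Tn = T[:]
        for i in range(1, len(T) - 1):
            Tn[i] = T[i] + r * (T[i+1] - 2*T[i] + T[i-1])
        T = Tn
    return T

# 580 K layer buried between 280 K surroundings; heat spreads outward
profile = diffuse_1d([280, 280, 580, 280, 280],
                     alpha=1e-6, dx=1.0, dt=1e5, steps=50)
```

The decay of the central peak and the slow warming of neighbouring nodes is the behaviour the abstract alludes to: months can pass before a deep heat injection produces a measurable surface signal.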

Relevance: 20.00%

Publisher:

Abstract:

This study sets out to investigate the psychology of immersion and the immersive response of individuals in relation to video and computer games. Initially, an exhaustive review of the literature is presented, including research into games, player demographics, personality, and identity. Play in traditional psychology is also reviewed, as well as previous research into immersion and attempts to define and measure this construct. An online qualitative study was carried out (N=38), and the data were analysed using content analysis. A definition of immersion emerged, as well as a classification of immersion into two separate types, namely vicarious immersion and visceral immersion. A survey study (N=217) verified the discrete nature of these categories and rejected the null hypothesis that there was no difference between individuals' interpretations of vicarious and visceral immersion. The primary aim of this research was to create a quantitative instrument which measures the immersive response as experienced by the player in a single game session. The IMX Questionnaire was developed using data from the initial qualitative study and the quantitative survey. Exploratory Factor Analysis was carried out on data from 300 participants for IMX Version 1, and Confirmatory Factor Analysis was conducted on data from 380 participants for IMX Version 2. IMX Version 3 was developed from the results of these analyses. This questionnaire was found to have high internal consistency reliability and validity.
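Internal consistency of the kind reported for the IMX Questionnaire is conventionally quantified with Cronbach's alpha; a minimal sketch with invented item scores (the abstract gives neither the coefficient nor the data):

```python
# Hedged sketch: Cronbach's alpha, a standard internal-consistency
# reliability coefficient for multi-item questionnaires. The item
# scores below are made up for illustration.

def cronbach_alpha(items):
    """items: one list of scores per questionnaire item,
    each list holding the same respondents in the same order."""
    k = len(items)                      # number of items
    n = len(items[0])                   # number of respondents

    def var(xs):                        # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_var_sum = sum(var(it) for it in items)
    totals = [sum(it[p] for it in items) for p in range(n)]
    return (k / (k - 1)) * (1 - item_var_sum / var(totals))

# Three items answered by four respondents
alpha = cronbach_alpha([[3, 4, 5, 2], [3, 5, 4, 2], [2, 4, 5, 3]])
```

Values above roughly 0.7-0.8 are usually read as acceptable internal consistency, which is the sense in which the abstract's "high internal consistency reliability" should be understood.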

Relevance: 20.00%

Publisher:

Abstract:

The topic of this thesis is impulsivity. The meaning and measurement of impulse control are explored, with a particular focus on forensic settings. Impulsivity is central to many areas of psychology; it is one of the most common diagnostic criteria of mental disorders and is fundamental to the understanding of forensic personalities. Despite this widespread importance there is little agreement as to the definition or structure of impulsivity, and its measurement is fraught with difficulty owing to a reliance on self-report methods. This research aims to address this problem by investigating the viability of using simple computerised cognitive performance tasks as complementary components of a multi-method assessment strategy for impulse control. Ultimately, the usefulness of this measurement strategy for a forensic sample is assessed. Impulsivity is found to be a multifaceted construct comprising a constellation of distinct sub-dimensions. Computerised cognitive performance tasks are valid and reliable measures that can assess impulsivity at a neuronal level. Self-report and performance task methods assess distinct components of impulse control, and for the optimal assessment of impulse control a multi-method battery of self-report and performance task measures is advocated. Such a battery is shown to have utility in a forensic sample, and recommendations for forensic assessment in the Irish context are discussed.

Relevance: 20.00%

Publisher:

Abstract:

The retrofitting of existing buildings for decreased energy usage, through increased energy efficiency, and for minimum carbon dioxide emissions throughout their remaining lifetime is a major area of research. This area requires development to provide building professionals with more efficient tools for determining building retrofit solutions. The overarching objective of this research is to develop such a tool through the implementation of a prescribed methodology. This has been achieved in three distinct steps. Firstly, the concept of using the degree-days modelling method as an adequate basis for retrofit decisions was analysed, and the results illustrated that the concept had merit. Secondly, the combination of the degree-days modelling method and the Genetic Algorithms optimisation method was investigated as a means of determining optimal thermal energy retrofit solutions. Thirdly, the two methods were packaged into a building retrofit decision-support tool named BRaSS (Building Retrofit Support Software). The results demonstrate clearly that fundamental building information, simplified occupancy profiles, and weather data used in a static simulation modelling method are a sufficient and adequate basis for retrofitting decisions. They also show that basing retrofit decisions upon energy analysis results is the best way to guide a retrofit project and to achieve results which are optimal for a particular building, and that BRaSS is an effective tool for determining optimum thermal energy retrofit solutions.
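The degree-days method at the heart of BRaSS can be sketched in a few lines; the base temperature and heat-loss coefficient below are illustrative assumptions, not values from the research:

```python
# Hedged sketch of the degree-days method, the static modelling
# approach the abstract describes. Heating demand is proportional to
# the accumulated shortfall of outdoor temperature below a base
# temperature. The 15.5 degC base and 200 W/K coefficient are common
# illustrative figures, not values taken from BRaSS.

def heating_degree_days(daily_mean_temps, base=15.5):
    """Sum of (base - T) over days where T falls below the base."""
    return sum(max(0.0, base - t) for t in daily_mean_temps)

def heating_energy_kwh(degree_days, heat_loss_coeff_w_per_k):
    """Energy = heat-loss coefficient (W/K) x degree-days x 24 h / 1000."""
    return heat_loss_coeff_w_per_k * degree_days * 24.0 / 1000.0

hdd = heating_degree_days([4.0, 6.5, 10.0, 16.0, 12.5])
energy = heating_energy_kwh(hdd, heat_loss_coeff_w_per_k=200.0)
```

A retrofit that lowers the building's heat-loss coefficient scales this estimate directly, which is what makes the method cheap enough to evaluate inside a Genetic Algorithm search over candidate retrofit measures.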

Relevance: 20.00%

Publisher:

Abstract:

The influence of communication technology on group decision-making has been examined in many studies, but the findings are inconsistent: some studies showed a positive effect on decision quality, while others have shown that communication technology makes decisions even worse. One possible explanation for these divergent findings could be the use of different Group Decision Support Systems (GDSS) in these studies, with some GDSS fitting the given task better than others and offering different sets of functions. This paper outlines an approach using an information system designed solely to examine the effect of (1) anonymity, (2) voting, and (3) blind picking on decision quality, discussion quality, and perceived quality of information.

Relevance: 20.00%

Publisher:

Abstract:

OBJECTIVE: Strict lifelong compliance with a gluten-free diet (GFD) minimizes the long-term risk of mortality, especially from lymphoma, in adult celiac disease (CD). Although serum IgA antitransglutaminase antibodies (IgA-tTG-ab), like antiendomysium antibodies (IgA-EMA), are sensitive and specific screening tests for untreated CD, their reliability as predictors of strict compliance with, and dietary transgressions from, a GFD is not precisely known. We aimed to address this question in consecutively treated adult celiacs. METHODS: In a cross-sectional study, 95 non-IgA-deficient adult (median age: 41 yr) celiacs on a GFD for at least 1 yr (median: 6 yr) were subjected to 1) a dietician-administered inquiry to pinpoint and quantify the number and levels of transgressions (classified as moderate or large, using as a cutoff value the median gluten amount ingested by the noncompliant patients of the series) over the previous 2 months, 2) a search for IgA-tTG-ab and -EMA, and 3) perendoscopic duodenal biopsies. The ability of both antibodies to discriminate celiacs with and without detected transgressions was described using receiver operating characteristic curves and quantified as to sensitivity and specificity, according to the level of transgressions. RESULTS: Forty (42%) patients strictly adhered to the GFD; 55 (58%) had committed transgressions, classified as moderate (≤18 g of gluten/2 months; median number 6) in 27 and large (>18 g; median number 69) in 28. IgA-tTG-ab and -EMA specificity (proportion of correct recognition of strictly compliant celiacs) was 0.97 and 0.98, respectively, and sensitivity (proportion of correct recognition of overall, moderate, and large levels of transgressions) was 0.52, 0.31, and 0.77, and 0.62, 0.37, and 0.86, respectively.
IgA-tTG-ab and -EMA titers were correlated (p < 0.001) with transgression levels (r = 0.560 and r = 0.631, respectively) and with one another (p < 0.001) in the whole patient population (r = 0.834, N = 84) as in the noncompliant (r = 0.915, N = 48) group. Specificity and sensitivity of IgA-tTG-ab and IgA-EMA for recognition of total villous atrophy in patients on a GFD were 0.90 and 0.91, and 0.60 and 0.73, respectively. CONCLUSIONS: In adult CD patients on a GFD, IgA-tTG-ab are poor predictors of dietary transgressions. Their negativity is a falsely reassuring marker of strict dietary compliance.
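The sensitivity and specificity definitions used above reduce to simple count ratios; a sketch with counts chosen to echo the reported 0.97 specificity and 0.52 overall sensitivity (the exact positive/negative counts are not given in the abstract and are illustrative here):

```python
# Hedged sketch: sensitivity and specificity as defined in the
# abstract (specificity = correct recognition of strictly compliant
# patients, sensitivity = correct recognition of transgressors).
# The counts below are invented to approximate the reported figures.

def sens_spec(tp, fn, tn, fp):
    sensitivity = tp / (tp + fn)   # transgressors correctly flagged
    specificity = tn / (tn + fp)   # compliant patients correctly cleared
    return sensitivity, specificity

# 55 transgressors of whom 29 test antibody-positive;
# 40 strictly compliant patients of whom 39 test negative
sens, spec = sens_spec(tp=29, fn=26, tn=39, fp=1)
```

The asymmetry visible here is exactly the paper's conclusion: a negative test clears almost all compliant patients, but misses nearly half of the transgressors.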

Relevance: 20.00%

Publisher:

Abstract:

Timing-related defects are major contributors to test escapes and in-field reliability problems for very-deep submicrometer integrated circuits. Small delay variations induced by crosstalk, process variations, power-supply noise, as well as resistive opens and shorts can potentially cause timing failures in a design, thereby leading to quality and reliability concerns. We present a test-grading technique that uses the method of output deviations for screening small-delay defects (SDDs). A new gate-delay defect probability measure is defined to model delay variations for nanometer technologies. The proposed technique intelligently selects the best set of patterns for SDD detection from an n-detect pattern set generated using timing-unaware automatic test-pattern generation (ATPG). It offers significantly lower computational complexity and excites a larger number of long paths compared to a current generation commercial timing-aware ATPG tool. Our results also show that, for the same pattern count, the selected patterns provide more effective coverage ramp-up than timing-aware ATPG and a recent pattern-selection method for random SDDs potentially caused by resistive shorts, resistive opens, and process variations. © 2010 IEEE.
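The selection step described above can be sketched as a greedy top-k choice over per-pattern scores; note this shows only the selection, with invented scores, and the actual output-deviation measure is a probabilistic, gate-level computation not reproduced here:

```python
# Hedged sketch: choosing the best patterns from an n-detect set by
# ranking on a per-pattern deviation score, in the spirit of the
# output-deviation grading described above. Pattern IDs and scores
# are invented for illustration.

def select_patterns(pattern_scores, budget):
    """Keep the `budget` patterns with the highest deviation score."""
    ranked = sorted(pattern_scores.items(),
                    key=lambda kv: kv[1], reverse=True)
    return [pid for pid, _ in ranked[:budget]]

chosen = select_patterns({"p1": 0.12, "p2": 0.47,
                          "p3": 0.08, "p4": 0.31}, budget=2)
```

The appeal of such a scheme is that the scoring pass is far cheaper than timing-aware ATPG, which is why the paper reports lower computational complexity for comparable or better SDD coverage.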

Relevance: 20.00%

Publisher:

Abstract:

Measuring the entorhinal cortex (ERC) is challenging due to lateral border discrimination from the perirhinal cortex. From a sample of 39 nondemented older adults who completed volumetric image scans and verbal memory indices, we examined reliability and validity concerns for three ERC protocols with different lateral boundary guidelines (i.e., Goncharova, Dickerson, Stoub, & deToledo-Morrell, 2001; Honeycutt et al., 1998; Insausti et al., 1998). We used three novice raters to assess inter-rater reliability on a subset of scans (216 total ERCs), with the entire dataset measured by one rater with strong intra-rater reliability on each technique (234 total ERCs). We found moderate to strong inter-rater reliability for two techniques with consistent ERC lateral boundary endpoints (Goncharova, Honeycutt), with negligible to moderate reliability for the technique requiring consideration of collateral sulcal depth (Insausti). Left ERC and story memory associations were moderate and positive for two techniques designed to exclude the perirhinal cortex (Insausti, Goncharova), with the Insausti technique continuing to explain 10% of memory score variance after additionally controlling for depression symptom severity. Right ERC-story memory associations were nonexistent after excluding an outlier. Researchers are encouraged to consider challenges of rater training for ERC techniques and how lateral boundary endpoints may impact structure-function associations.
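As a crude stand-in for the reliability indices discussed above, a Pearson correlation between two raters' volume measurements can be computed as below; studies like this one typically report intraclass correlations instead, which additionally penalize systematic offsets between raters (the volumes below are invented):

```python
# Hedged sketch: Pearson correlation between two raters' ERC volume
# measurements as a simple consistency index. A real reliability
# analysis would use an intraclass correlation; the data are invented.

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

rater_a = [1.92, 2.10, 1.75, 2.31, 1.88]   # hypothetical volumes, cm^3
rater_b = [1.85, 2.05, 1.80, 2.25, 1.95]
r = pearson_r(rater_a, rater_b)
```

A high Pearson r can coexist with a constant bias between raters, which is one reason the abstract's distinction between boundary protocols matters: an intraclass correlation would expose such an offset where a plain correlation would not.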

Relevance: 20.00%

Publisher:

Abstract:

BACKGROUND: With the globalization of clinical trials, a growing emphasis has been placed on the standardization of the workflow in order to ensure the reproducibility and reliability of the overall trial. Despite the importance of workflow evaluation, to our knowledge no previous studies have attempted to adapt existing modeling languages to standardize the representation of clinical trials. Unified Modeling Language (UML) is a computational language that can be used to model operational workflow, and a UML profile can be developed to standardize UML models within a given domain. This paper's objective is to develop a UML profile to extend the UML Activity Diagram schema into the clinical trials domain, defining a standard representation for clinical trial workflow diagrams in UML. METHODS: Two Brazilian clinical trial sites in rheumatology and oncology were examined to model their workflow and collect time-motion data. UML modeling was conducted in Eclipse, and a UML profile was developed to incorporate information used in discrete event simulation software. RESULTS: Ethnographic observation revealed bottlenecks in the workflow: these included tasks requiring the full commitment of clinical research coordinators (CRCs), transferring notes from paper to computers, deviations from standard operating procedures, and conflicts between different IT systems. Time-motion analysis revealed that nurses' activities took up the most time in the workflow and contained a high frequency of shorter-duration activities. Administrative assistants performed more activities near the beginning and end of the workflow. Overall, clinical trial tasks had a greater frequency than clinic routines or other general activities. CONCLUSIONS: This paper describes a method for modeling clinical trial workflow in UML and standardizing these workflow diagrams through a UML profile.
In the increasingly global environment of clinical trials, the standardization of workflow modeling is a necessary precursor to conducting a comparative analysis of international clinical trials workflows.

Relevance: 20.00%

Publisher:

Abstract:

Gemstone Team ILL (Interactive Language Learning)

Relevance: 20.00%

Publisher:

Abstract:

Gemstone Team MICE (Modifying and Improving Computer Ergonomics)

Relevance: 20.00%

Publisher:

Abstract:

The early detection of developmental disorders is key to child outcome, allowing interventions to be initiated which promote development and improve prognosis. Research on autism spectrum disorder (ASD) suggests that behavioral signs can be observed late in the first year of life. Many of these studies involve extensive frame-by-frame video observation and analysis of a child's natural behavior. Although nonintrusive, these methods are extremely time-intensive and require a high level of observer training; thus, they are burdensome for clinical and large population research purposes. This work is a first milestone in a long-term project on non-invasive early observation of children in order to aid in risk detection and research of neurodevelopmental disorders. We focus on providing low-cost computer vision tools to measure and identify ASD behavioral signs based on components of the Autism Observation Scale for Infants (AOSI). In particular, we develop algorithms to measure responses to general ASD risk assessment tasks and activities outlined by the AOSI which assess visual attention by tracking facial features. We show results, including comparisons with expert and nonexpert clinicians, which demonstrate that the proposed computer vision tools can capture critical behavioral observations and potentially augment the clinician's behavioral observations obtained from real in-clinic assessments.

Relevance: 20.00%

Publisher:

Abstract:

The early detection of developmental disorders is key to child outcome, allowing interventions to be initiated that promote development and improve prognosis. Research on autism spectrum disorder (ASD) suggests behavioral markers can be observed late in the first year of life. Many of these studies involved extensive frame-by-frame video observation and analysis of a child's natural behavior. Although non-intrusive, these methods are extremely time-intensive and require a high level of observer training; thus, they are impractical for clinical and large population research purposes. Diagnostic measures for ASD are available for infants but are only accurate when used by specialists experienced in early diagnosis. This work is a first milestone in a long-term multidisciplinary project that aims at helping clinicians and general practitioners accomplish this early detection/measurement task automatically. We focus on providing computer vision tools to measure and identify ASD behavioral markers based on components of the Autism Observation Scale for Infants (AOSI). In particular, we develop algorithms to measure three critical AOSI activities that assess visual attention. We augment these AOSI activities with an additional test that analyzes asymmetrical patterns in unsupported gait. The first set of algorithms involves assessing head motion by tracking facial features, while the gait analysis relies on joint foreground segmentation and 2D body pose estimation in video. We show results that provide insightful knowledge to augment the clinician's behavioral observations obtained from real in-clinic assessments.
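The head-motion component of the visual-attention measurement can be sketched as centroid displacement of tracked facial feature points; the landmark coordinates below are invented, and a real system would obtain them from a face tracker on each video frame:

```python
# Hedged sketch: estimating lateral head motion from tracked facial
# feature points, the kind of signal the AOSI visual-attention tasks
# above rely on. Landmark coordinates are invented; a real pipeline
# would supply them from a per-frame face tracker.

def centroid(points):
    xs, ys = zip(*points)
    return sum(xs) / len(xs), sum(ys) / len(ys)

def horizontal_motion(frames):
    """Frame-to-frame x-displacement of the landmark centroid."""
    cx = [centroid(f)[0] for f in frames]
    return [b - a for a, b in zip(cx, cx[1:])]

frames = [
    [(100, 120), (140, 118), (120, 150)],   # frame 0
    [(104, 120), (144, 118), (124, 150)],   # frame 1: head moved right
    [(104, 121), (144, 119), (124, 151)],   # frame 2: no lateral motion
]
dx = horizontal_motion(frames)
```

Thresholding such a displacement signal is one simple way to decide whether the infant turned toward a stimulus during an attention task, which is the behavioral observation the clinician would otherwise score by eye.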

Relevance: 20.00%

Publisher:

Abstract:

Gemstone Team FACE