349 results for Query errors


Relevance: 10.00%

Publisher:

Abstract:

Driving is a vigilance task, requiring sustained attention to maintain performance and avoid crashes. Hypovigilance (i.e., a marked reduction in vigilance) while driving manifests as poor driving performance and is commonly attributed to fatigue (Dinges, 1995). However, poor driving performance has been found to be more frequent when driving in monotonous road environments, suggesting that monotony plays a role in generating hypovigilance (Thiffault & Bergeron, 2003b). Research to date has tended to conceptualise monotony as a uni-dimensional task characteristic, typically used over a prolonged period of time to facilitate other factors under investigation, most notably fatigue. However, more often than not, more than one exogenous factor relating to the task or operating environment is manipulated to vary or generate monotony (Mascord & Heath, 1992). Here we aimed to explore whether monotony is a multi-dimensional construct determined by characteristics of both the task proper and the task environment. The general assumption that monotony is a task characteristic used solely to elicit hypovigilance or fatigue-related poor performance appears to have led to little rigorous investigation of the exact nature of the relationship. While the two concepts are undoubtedly linked, the independent effect of monotony on hypovigilance remains largely ignored. Nevertheless, there is evidence that monotony effects can emerge very early in vigilance tasks and are not necessarily accompanied by fatigue (see Meuter, Rakotonirainy, Johns, & Wagner, 2005). This phenomenon raises a largely untested empirical question explored in two studies: can hypovigilance emerge as a consequence of task and/or environmental monotony, independent of time on task and fatigue? In Study 1, using a short computerised vigilance task requiring responses to be withheld to infrequent targets, we explored the differential impacts of stimulus and task-demand manipulations on the development of a monotonous context and the associated effects on vigilance performance (as indexed by response errors and response times), independent of fatigue and time on task. The role of individual differences (sensation seeking, extroversion and cognitive failures) in moderating monotony effects was also considered. The results indicate that monotony affects sustained attention, with greater hypovigilance and poorer performance in monotonous than in non-monotonous contexts. Critically, performance decrements emerged early in the task (within 4.3 minutes) and remained consistent over the course of the experiment (21.5 minutes), suggesting that monotony effects can operate independently of time on task and fatigue. A combination of low task demands and low stimulus variability forms a monotonous context characterised by hypovigilance and poor task performance. Variations to task demand and stimulus variability were also found to independently affect performance, suggesting that monotony is a multi-dimensional construct relating to both task monotony (associated with the task itself) and environmental monotony (related to characteristics of the stimulus). Consequently, it can be concluded that monotony is multi-dimensional and is characterised by low variability in stimuli and/or task demands. The proposition that individual differences emerge under conditions of varying monotony, with high sensation seekers and/or extroverts performing worse in monotonous contexts, was only partially supported.
Using a driving simulator, the findings of Study 1 were extended to a driving context to identify the behavioural and psychophysiological indices of monotony-related hypovigilance associated with variations to road design and roadside scenery (Study 2). Supporting the proposition that monotony is a multi-dimensional construct, road design variability emerged as a key moderating characteristic of environmental monotony, resulting in poor driving performance indexed by decrements in steering wheel measures (mean lateral position). Sensation seeking also emerged as a moderating factor, with participants high in sensation seeking tendencies displaying worse driving behaviour in monotonous conditions. Importantly, impaired driving performance was observed within 8 minutes of commencing the driving task characterised by environmental monotony (low variability in road design) and was not accompanied by a decline in psychophysiological arousal. In addition, no subjective declines in alertness were reported. Given that fatigue effects are associated with prolonged driving (van der Hulst, Meijman, & Rothengatter, 2001) and indexed by drowsiness, this pattern of results indicates that monotony can affect driver vigilance independently of time on task and fatigue. Perceptual load theory (Lavie, 1995, 2005) and mindlessness theory (Robertson, Manly, Andrade, Baddeley, & Yiend, 1997) provide useful theoretical frameworks for explaining and predicting monotony effects by positing that the low load (of task and/or stimuli) associated with a monotonous task results in spare attentional capacity that spills over involuntarily, resulting in the processing of task-irrelevant stimuli or task-unrelated thoughts. That is, individuals, even when not fatigued, become easily distracted when performing a highly monotonous task, resulting in hypovigilance and impaired performance. The implications for road safety, including the likely effectiveness of fatigue countermeasures in mitigating monotony-related driver hypovigilance, are discussed.

Relevance: 10.00%

Publisher:

Abstract:

In this paper, we propose a search-based approach to joining two tables in the absence of clean join attributes. Non-structured documents from the web are used to express the correlations between a given query and a reference list. A major challenge in implementing this approach is efficiently determining the number of times, and the locations at which, each clean reference from the reference list is approximately mentioned in the retrieved documents. We formalize this as the Approximate Membership Localization (AML) problem and propose an efficient partial pruning algorithm to solve it. A study using real-world data sets demonstrates the effectiveness of our search-based approach and the efficiency of our AML algorithm.
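
To make the AML task concrete, the sketch below locates approximate mentions of clean references in a document using a naive sliding-window, token-level Jaccard match. It is only a quadratic baseline with invented data and an invented threshold, not the partial pruning algorithm the abstract proposes.

```python
# Naive approximate membership localization: for each clean reference,
# find the spans in a tokenised document whose token window approximately
# matches it. Quadratic baseline for illustration only.

def jaccard(a, b):
    """Token-set Jaccard similarity."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def locate_mentions(doc_tokens, reference, threshold=0.6):
    """Return (start, end) spans in doc_tokens that approximately match
    the tokenised reference above the similarity threshold."""
    ref_tokens = reference.lower().split()
    w = len(ref_tokens)
    spans = []
    for start in range(len(doc_tokens) - w + 1):
        window = [t.lower() for t in doc_tokens[start:start + w]]
        if jaccard(window, ref_tokens) >= threshold:
            spans.append((start, start + w))
    return spans

if __name__ == "__main__":
    doc = "the ACM SIGMOD conference on management of data was held".split()
    refs = ["SIGMOD Conference on Management of Data", "VLDB Conference"]
    for r in refs:
        print(r, "->", locate_mentions(doc, r))
```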

Relevance: 10.00%

Publisher:

Abstract:

Unusual event detection in crowded scenes remains challenging because of the diversity of events and noise. In this paper, we present a novel approach to unusual event detection via sparse reconstruction of dynamic textures over an overcomplete basis set, with the dynamic texture described by local binary patterns from three orthogonal planes (LBP-TOP). The overcomplete basis set is learnt from training data in which only normal items are observed. In the detection process, given a new observation, we compute the sparse coefficients using the Dantzig Selector algorithm proposed in the compressed sensing literature. The reconstruction errors are then computed, and abnormal items are detected on that basis. Our approach can be used to detect both local and global abnormal events. We evaluate our algorithm on the UCSD Abnormality Datasets for local anomaly detection, where it outperforms current state-of-the-art approaches, and we also obtain promising results for rapid escape detection using the PETS2009 dataset.
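
As a rough illustration of the detection pipeline described above, the sketch below solves the Dantzig Selector as a linear program (via scipy's linprog) and scores a sample by its reconstruction error over an overcomplete basis. The dictionary and feature vectors are synthetic stand-ins for a basis learnt from normal LBP-TOP descriptors, and the tolerance value is an arbitrary choice.

```python
# Anomaly scoring by sparse reconstruction over an overcomplete basis,
# with the sparse code obtained from the Dantzig Selector
#   min ||x||_1  s.t.  ||D^T (y - D x)||_inf <= eps,
# solved here as a linear program. The dictionary D below stands in for
# one learnt from normal data; all numbers are synthetic.

import numpy as np
from scipy.optimize import linprog

def dantzig_selector(D, y, eps=0.05):
    """Solve the Dantzig Selector LP; returns the sparse code x."""
    n, p = D.shape
    G = D.T @ D
    b = D.T @ y
    # Variables: x = u - v with u, v >= 0; minimise sum(u + v).
    c = np.ones(2 * p)
    A_ub = np.vstack([np.hstack([G, -G]),    #  G(u - v) - b <= eps
                      np.hstack([-G, G])])   # -G(u - v) + b <= eps
    b_ub = np.concatenate([eps + b, eps - b])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(0, None), method="highs")
    uv = res.x
    return uv[:p] - uv[p:]

def reconstruction_error(D, y, eps=0.05):
    x = dantzig_selector(D, y, eps)
    return np.linalg.norm(y - D @ x)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy dictionary: 200 atoms confined to a low-dimensional "normal"
    # subspace of R^100, mimicking a basis learnt from normal samples.
    B = rng.standard_normal((100, 20))
    D = B @ rng.standard_normal((20, 200))
    D /= np.linalg.norm(D, axis=0)
    normal = D @ np.where(rng.random(200) < 0.03, 1.0, 0.0)  # sparse in D
    unusual = rng.standard_normal(100)                        # mostly outside
    print("normal  error:", reconstruction_error(D, normal))
    print("unusual error:", reconstruction_error(D, unusual))
```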

Relevance: 10.00%

Publisher:

Abstract:

In many applications, e.g., bioinformatics, web access traces and system utilisation logs, the data is naturally in the form of sequences. There is great interest in analysing sequential data to find the inherent characteristics or relationships within it. Sequential association rule mining is one of the possible methods used to analyse such data. Because conventional sequential association rule mining very often generates a huge number of association rules, many of which are redundant, it is desirable to find a way to eliminate the unnecessary rules. Owing to the complexity and temporally ordered nature of sequential data, current research on sequential association rule mining is limited. Although several sequential association rule prediction models using either sequence constraints or temporal constraints have been proposed, none of them has considered the redundancy problem in rule mining. The main contribution of this research is to propose a non-redundant association rule mining method based on closed frequent sequences and minimal sequential generators. We also give a definition of non-redundant sequential rules, which are sequential rules with minimal antecedents but maximal consequents. A new algorithm called CSGM (closed sequential and generator mining) for generating closed sequences and minimal sequential generators is also introduced. A further experiment compares the performance of generating non-redundant sequential rules with that of generating full sequential rules, and the performance of CSGM is evaluated against other closed sequential pattern mining and generator mining algorithms. We also use the generated non-redundant sequential rules for query expansion in order to improve recommendations for infrequently purchased products.
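
The sketch below illustrates the definitions involved, purely as a toy: given an already-mined table of frequent sequences and their supports, it identifies closed sequences (no supersequence with equal support) and generators (no subsequence with equal support), then pairs them into rules with minimal antecedents and maximal consequents. It is not the CSGM algorithm itself, and the supports are invented.

```python
# Toy post-processing over an already-mined set of frequent sequences:
# derive closed sequences, sequential generators, and non-redundant rules
# (minimal antecedent, maximal consequent).

def is_subsequence(s, t):
    """True if s is an (order-preserving) subsequence of t."""
    it = iter(t)
    return all(item in it for item in s)

# frequent sequence -> support (invented values)
freq = {
    ("a",): 5, ("b",): 5, ("c",): 4,
    ("a", "b"): 5, ("a", "c"): 4, ("b", "c"): 4,
    ("a", "b", "c"): 4,
}

def closed_sequences(freq):
    """Sequences with no proper supersequence of equal support."""
    return {s for s, sup in freq.items()
            if not any(s != t and sup == freq[t] and is_subsequence(s, t)
                       for t in freq)}

def generators(freq):
    """Sequences with no proper subsequence of equal support."""
    return {s for s, sup in freq.items()
            if not any(s != t and sup == freq[t] and is_subsequence(t, s)
                       for t in freq)}

def non_redundant_rules(freq, min_conf=0.0):
    rules = []
    for g in generators(freq):
        for c in closed_sequences(freq):
            if g != c and is_subsequence(g, c):
                conf = freq[c] / freq[g]
                if conf >= min_conf:
                    rules.append((g, c, conf))
    return rules

if __name__ == "__main__":
    for g, c, conf in non_redundant_rules(freq, min_conf=0.8):
        print(f"{g} => {c}  (confidence {conf:.2f})")
```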

Relevance: 10.00%

Publisher:

Abstract:

Transmission smart grids will use a digital platform for the automation of high voltage substations. The IEC 61850 series of standards, released in parts over the last ten years, provides a specification for substation communications networks and systems. These standards, along with IEEE Std 1588-2008 Precision Time Protocol version 2 (PTPv2) for precision timing, are recommended by both the IEC Smart Grid Strategy Group and the NIST Framework and Roadmap for Smart Grid Interoperability Standards for substation automation. IEC 61850, PTPv2 and Ethernet are three complementary protocol families that together define the future of sampled value digital process connections for smart substation automation. A time synchronisation system is required for a sampled value process bus; however, the details are not defined in IEC 61850-9-2. PTPv2 provides the greatest accuracy of network-based time transfer systems, with timing errors of less than 100 ns achievable. The suitability of PTPv2 for synchronising sampling in a digital process bus is evaluated, with preliminary results indicating that the steady-state performance of low-cost clocks is an acceptable ±300 ns, but that corrections issued by grandmaster clocks can introduce significant transients. Extremely stable grandmaster oscillators are required to ensure that any corrections are sufficiently small that time synchronising performance is not degraded.

Relevance: 10.00%

Publisher:

Abstract:

Periprosthetic fractures are increasingly frequent. The fracture may be located over the shaft of the prosthesis, at its tip or below it (21). The treatment of explosion fractures is difficult because the shaft blocks the application of implants, such as screws, that need to penetrate the medullary cavity. The cerclage, a simple periosteal loop made of wire or, more recently, cable, not only avoids the medullary cavity; its centripetal mode of action is also well suited to reducing and maintaining radially displaced fractures. Furthermore, the cerclage lends itself well to minimally invasive internal fixation. New insight challenges the disrepute from which cerclage technology suffered for decades. The outcome of cerclage fixation benefits from an improved understanding of its technology, mechano-biology and the periosteal blood supply. Preconceived and generally accepted opinions such as "strangulation of blood supply" need to be re-examined. Recent mechanical evaluations (22) demonstrate that wire application may be improved, but cable is superior in handling, maintenance of tension and strength. Besides the classical concepts of absolute and relative stability, a further stability condition, typical of cerclage, needs consideration. Called "loose-lock stability", it describes the situation where a loosened implant first allows unimpeded displacement, which changes abruptly into a locked fixation preventing further dislocation.

Relevance: 10.00%

Publisher:

Abstract:

In this paper, we present a method for the recovery of position and absolute attitude (including pitch, roll and yaw) using a novel fusion of monocular visual odometry and GPS measurements, in a manner similar to a classic loosely coupled GPS/INS error-state navigation filter. The proposed filter does not require additional restrictions or assumptions such as platform-specific dynamics, map matching, feature tracking, visual loop closing, a gravity vector, or additional sensors such as an IMU or magnetic compass. An observability analysis of the proposed filter is performed, showing that the scale factor, position and attitude errors are fully observable under acceleration that is non-parallel to the velocity vector in the navigation frame. The observability properties of the proposed filter are demonstrated using numerical simulations. We conclude the article with an implementation of the proposed filter using real flight data collected from a Cessna 172 equipped with a downwards-looking camera and GPS, showing the feasibility of the algorithm in real-world conditions.
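
The full error-state filter is beyond a short example, but the core observability idea, namely that GPS measurements make the unknown monocular scale factor estimable, can be sketched with a one-state Kalman filter fusing unscaled visual-odometry displacements with GPS displacements. Everything below (noise levels, motion profile) is synthetic and far simpler than the filter described in the abstract.

```python
# Toy illustration of scale observability in loosely coupled monocular
# VO / GPS fusion: monocular odometry reports displacement only up to an
# unknown scale, which a one-state Kalman filter can recover once metric
# GPS displacements are available.

import numpy as np

def estimate_scale(vo_disp, gps_disp, q=1e-6, r=0.5):
    """Recursively estimate the VO scale factor from paired displacement
    magnitudes (vo_disp: unscaled, gps_disp: metres)."""
    s_hat, P = 1.0, 10.0          # initial scale guess and variance
    history = []
    for v, z in zip(vo_disp, gps_disp):
        P += q                    # random-walk process model for the scale
        H = v                     # measurement model: z = s * v + noise
        K = P * H / (H * P * H + r)
        s_hat += K * (z - H * s_hat)
        P *= (1.0 - K * H)
        history.append(s_hat)
    return np.array(history)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    true_scale = 3.2
    vo = rng.uniform(5.0, 15.0, size=100)                    # unscaled VO steps
    gps = true_scale * vo + rng.normal(0.0, 0.7, size=100)   # metric GPS steps
    est = estimate_scale(vo, gps)
    print("final scale estimate:", est[-1], "(true:", true_scale, ")")
```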

Relevance: 10.00%

Publisher:

Abstract:

We measured wave aberrations over the central 42° x 32° visual field for a 5 mm pupil in groups of 10 emmetropic (mean spherical equivalent 0.11 ± 0.50 D) and 9 myopic (MSE -3.67 ± 1.91 D) young adults. Relative peripheral refractive errors over the measured field were generally myopic in both groups. Mean values of spherical aberration were almost constant across the measured field and were more positive in emmetropes (+0.023 ± 0.043 microns) than in myopes (-0.007 ± 0.045 microns). Coma varied more rapidly with field angle in myopes; modeling suggested that this difference reflected the differences in mean anterior corneal shape and axial length between the two groups. In general, however, overall levels of RMS aberration differed only modestly between the two groups, implying that it is unlikely that high levels of aberration contribute to myopia development.

Relevance: 10.00%

Publisher:

Abstract:

This study investigated how Year 7 students' interpretation of mathematical problems impacted their ability to demonstrate what they can do in NAPLAN numeracy testing. In the study, mathematics is viewed as a culturally and socially determined system of signs and signifiers that establishes the meaning, origins and importance of mathematics. The study hypothesises that students are unable to succeed in NAPLAN numeracy tests because they cannot interpret the questions, even though they may be able to perform the necessary calculations. To investigate this, the study applied contemporary theories of literacy to the context of mathematical problem solving. A case study design with multiple methods was used. The study used a correlational design to explore the connections between the NAPLAN literacy and numeracy outcomes of 198 Year 7 students in a Queensland school. Additionally, qualitative methods provided a rich description of the effect of the various forms of NAPLAN numeracy questions on the success of ten Year 7 students in the same school. The study argues that there is a quantitative link between reading and numeracy. It shows that interpretation (literacy) errors, made by students of all abilities, are the most common error type in the selected NAPLAN questions. In contrast, conceptual (mathematical) errors are less frequent amongst more capable students. This has important implications for preparing students for NAPLAN numeracy tests. The study concludes by recommending that an increased focus on the literacies of mathematics would be effective in improving NAPLAN results.

Relevance: 10.00%

Publisher:

Abstract:

This paper presents an experiment designed to investigate whether redundancy in an interface has any impact on the use of complex interfaces by older people and people with little prior experience with technology. The important findings of this study were that older people (65+ years) completed tasks on the Words-only interface faster than on the Redundant (text and symbols) interface, whereas the rest of the participants completed tasks significantly faster on the Redundant interface. From a cognitive-processing perspective, sustained attention (one of the functions of the central executive) emerged as one of the important factors in completing tasks on complex interfaces faster and with fewer errors.

Relevance: 10.00%

Publisher:

Abstract:

The research objectives of this thesis were to contribute to Bayesian statistical methodology, specifically to risk assessment methodology and to spatial and spatio-temporal methodology, by modelling error structures using complex hierarchical models. Specifically, I hoped to consider two applied areas, and to use these applications as a springboard for developing new statistical methods as well as undertaking analyses that might give answers to particular applied questions. Thus, this thesis considers a series of models, firstly in the context of risk assessments for recycled water, and secondly in the context of water usage by crops. The research objective was to model error structures using hierarchical models in two problems: risk assessment analyses for wastewater and, in a four-dimensional dataset, assessing differences between cropping systems over time and over three spatial dimensions. The aim was to use the simplicity and insight afforded by Bayesian networks to develop appropriate models for risk scenarios, and again to use Bayesian hierarchical models to explore the necessarily complex modelling of four-dimensional agricultural data. The specific objectives of the research were to develop a method for the calculation of credible intervals for the point estimates of Bayesian networks; to develop a model structure incorporating all the experimental uncertainty associated with various constants, thereby allowing the calculation of more credible credible intervals for a risk assessment; to model a single day’s data from the agricultural dataset in a way that satisfactorily captured the complexities of the data; to build a model for several days’ data, in order to consider how the full data might be modelled; and finally to build a model for the full four-dimensional dataset and to consider the time-varying nature of the contrast of interest, having satisfactorily accounted for possible spatial and temporal autocorrelations. This work forms five papers, two of which have been published, two submitted, and the final paper still in draft. The first two objectives were met by recasting the risk assessments as directed acyclic graphs (DAGs). In the first case, we elicited uncertainty for the conditional probabilities needed by the Bayesian net, incorporated these into a corresponding DAG, and used Markov chain Monte Carlo (MCMC) to find credible intervals for all the scenarios and outcomes of interest. In the second case, we incorporated the experimental data underlying the risk assessment constants into the DAG, and also treated some of those data as an ‘errors-in-variables’ problem [Fuller, 1987]. This illustrated a simple method for incorporating experimental error into risk assessments. In considering one day of the three-dimensional agricultural data, it became clear that geostatistical models or conditional autoregressive (CAR) models over the three dimensions were not the best way to approach the data. Instead, CAR models are used with neighbours only in the same depth layer. This gave flexibility to the model, allowing both the spatially structured and unstructured variances to differ at all depths. We call this model the CAR layered model. Given the experimental design, the fixed part of the model could have been modelled as a set of means by treatment and by depth, but doing so allows little insight into how the treatment effects vary with depth.
Hence, a number of essentially non-parametric approaches were taken to examine the effects of depth on treatment, with the model of choice incorporating an errors-in-variables approach for depth in addition to a non-parametric smooth. The statistical contribution here was the introduction of the CAR layered model, and the applied contribution was the analysis of moisture over depth and the estimation of the contrast of interest together with its credible intervals. These models were fitted using WinBUGS [Lunn et al., 2000]. The work in the fifth paper deals with the fact that, for large datasets, the use of WinBUGS becomes more problematic because of its highly correlated term-by-term updating. In this work, we introduce a Gibbs sampler with block updating for the CAR layered model. The Gibbs sampler was implemented by Chris Strickland using pyMCMC [Strickland, 2010]. This framework is then used to consider five days’ data, and we show that moisture in the soil for all the various treatments reaches levels particular to each treatment at a depth of 200 cm and thereafter stays constant, albeit with variances that increase with depth. In an analysis across three spatial dimensions and across time, there are many interactions of time and the spatial dimensions to be considered. Hence, we chose to use a daily model and to repeat the analysis at all time points, effectively creating an interaction model of time by the daily model. Such an approach allows great flexibility; however, it does not allow insight into the way in which the parameter of interest varies over time. Hence, a two-stage approach was also used, with estimates from the first stage being analysed as a set of time series. We see this spatio-temporal interaction model as a useful approach to data measured across three spatial dimensions and time, since it does not assume additivity of the random spatial or temporal effects.
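
As one reading of the "CAR layered model" described above, the sketch below assembles the precision matrix of a proper CAR prior whose neighbourhoods are confined to a single depth layer, so each layer carries its own spatial precision and smoothing parameter. Grid sizes and parameter values are invented for illustration; the thesis itself fitted these models with WinBUGS and a block-updating Gibbs sampler in pyMCMC.

```python
# Precision structure of a "layered" CAR prior: a proper CAR prior on a
# regular grid within each depth layer, with no neighbours across layers,
# so the spatial variance parameters can differ by depth.

import numpy as np
from scipy.linalg import block_diag

def grid_adjacency(nx, ny):
    """First-order (rook) adjacency matrix for an nx-by-ny grid."""
    n = nx * ny
    W = np.zeros((n, n))
    for i in range(nx):
        for j in range(ny):
            k = i * ny + j
            if i + 1 < nx:
                W[k, k + ny] = W[k + ny, k] = 1.0
            if j + 1 < ny:
                W[k, k + 1] = W[k + 1, k] = 1.0
    return W

def car_precision(W, tau, rho):
    """Proper CAR precision: Q = tau * (diag(row sums) - rho * W)."""
    D = np.diag(W.sum(axis=1))
    return tau * (D - rho * W)

def layered_car_precision(nx, ny, taus, rhos):
    """Block-diagonal precision: one CAR block per depth layer."""
    W = grid_adjacency(nx, ny)
    return block_diag(*[car_precision(W, t, r) for t, r in zip(taus, rhos)])

if __name__ == "__main__":
    # Three depth layers over a 4 x 5 plot grid, each with its own
    # (hypothetical) spatial precision and smoothing parameter.
    Q = layered_car_precision(4, 5, taus=[2.0, 1.0, 0.5], rhos=[0.9, 0.8, 0.6])
    print("precision matrix shape:", Q.shape)               # (60, 60)
    print("positive definite:", np.all(np.linalg.eigvalsh(Q) > 0))
```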

Relevance: 10.00%

Publisher:

Abstract:

Intuitively, any ‘bag of words’ approach in IR should benefit from taking term dependencies into account. Unfortunately, for years the results of exploiting such dependencies have been mixed or inconclusive. To improve the situation, this paper shows how the natural language properties of the target documents can be used to transform and enrich the term dependencies into more useful statistics. This is done in three steps. First, the term co-occurrence statistics of queries and documents are each represented by a Markov chain. The paper proves that such a chain is ergodic, and therefore its asymptotic behavior is unique, stationary, and independent of the initial state. Next, the stationary distribution is taken to model queries and documents, rather than their initial distributions. Finally, ranking is achieved following the customary language modeling paradigm. The main contribution of this paper is to argue why the asymptotic behavior of the document model is a better representation than just the document’s initial distribution. A secondary contribution is to investigate the practical application of this representation as queries become increasingly verbose. In the experiments (based on Lemur’s search engine substrate), the default query model was replaced by the stable distribution of the query. Just modeling the query this way already resulted in significant improvements over a standard language model baseline. The results were on a par with, or better than, more sophisticated algorithms that use fine-tuned parameters or extensive training. Moreover, the more verbose the query, the more effective the approach seems to become.
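
A minimal sketch of the idea follows: build a Markov chain from windowed term co-occurrences, smooth it so it is ergodic, take its stationary distribution as the text's language model, and score with query likelihood. The window size, smoothing constant and mixing weight are illustrative choices, and for brevity the stationary distribution is applied on the document side rather than reproducing the paper's exact query-side configuration.

```python
# Represent a text by the stationary distribution of a Markov chain built
# from its term co-occurrence counts, then use that distribution in place
# of the empirical term distribution in a language-modelling scorer.

import math

def cooccurrence_chain(tokens, window=3, alpha=0.01):
    """Row-stochastic transition matrix from windowed co-occurrences,
    with additive smoothing so the chain is ergodic."""
    vocab = sorted(set(tokens))
    idx = {t: i for i, t in enumerate(vocab)}
    n = len(vocab)
    counts = [[alpha] * n for _ in range(n)]
    for i, t in enumerate(tokens):
        for u in tokens[max(0, i - window): i + window + 1]:
            if u != t:
                counts[idx[t]][idx[u]] += 1.0
    P = [[c / sum(row) for c in row] for row in counts]
    return vocab, P

def stationary(P, iters=200):
    """Power iteration for the stationary distribution of row-stochastic P."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

def score(query_tokens, doc_tokens, mu=0.5):
    """Query likelihood with the document modelled by its stationary
    distribution, mixed with a uniform background for unseen terms."""
    vocab, P = cooccurrence_chain(doc_tokens)
    pi = dict(zip(vocab, stationary(P)))
    background = 1.0 / (len(set(doc_tokens)) + 1)
    return sum(math.log((1 - mu) * pi.get(t, 0.0) + mu * background)
               for t in query_tokens)

if __name__ == "__main__":
    doc_a = "markov chains model term dependencies in documents".split()
    doc_b = "the cat sat on the mat near the door".split()
    q = "term dependencies".split()
    print("doc_a:", score(q, doc_a))
    print("doc_b:", score(q, doc_b))
```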

Relevance: 10.00%

Publisher:

Abstract:

Background: Few studies have specifically investigated the functional effects of uncorrected astigmatism on measures of reading fluency. This information is important to provide evidence for the development of clinical guidelines for the correction of astigmatism. Methods: Participants included 30 visually normal young adults (mean age 21.7 ± 3.4 years). Distance and near visual acuity and reading fluency were assessed with optimal spectacle correction (baseline) and for two levels of astigmatism, 1.00DC and 2.00DC, at two axes (90° and 180°) to induce both against-the-rule (ATR) and with-the-rule (WTR) astigmatism. Reading and eye movement fluency were assessed using standardized clinical measures including the test of Discrete Reading Rate (DRR), the Developmental Eye Movement (DEM) test, and by recording eye movement patterns with the Visagraph (III) during reading for comprehension. Results: Both distance and near acuity were significantly decreased compared to baseline for all of the astigmatic lens conditions (p < 0.001). Reading speed with the DRR for N16 print size was significantly reduced for the 2.00DC ATR condition (a reduction of 10%), while for smaller text sizes reading speed was reduced by up to 24% for the 1.00DC ATR condition and for the 2.00DC condition in both axis directions (p < 0.05). For the DEM, sub-test completion speeds were significantly impaired, with the 2.00DC condition affecting both vertical and horizontal times and the 1.00DC ATR condition affecting only horizontal times (p < 0.05). Visagraph reading eye movements were not significantly affected by the induced astigmatism. Conclusions: Induced astigmatism impaired performance on selected tests of reading fluency, with ATR astigmatism having significantly greater effects on performance than did WTR, even for relatively small amounts of astigmatic blur of 1.00DC. These findings have implications for the minimal prescribing criteria for astigmatic refractive errors.

Relevance: 10.00%

Publisher:

Abstract:

Building an efficient and an effective search engine is a very challenging task. In this paper, we present the efficiency and effectiveness of our search engine at the INEX 2009 Efficiency and Ad Hoc Tracks. We have developed a simple and effective pruning method for fast query evaluation, and used a two-step process for Ad Hoc retrieval. The overall results from both tracks show that our search engine performs very competitively in terms of both efficiency and effectiveness.

Relevance: 10.00%

Publisher:

Abstract:

As business process management technology matures, organisations acquire more and more business process models. The resulting collections can consist of hundreds, even thousands of models and their management poses real challenges. One of these challenges concerns model retrieval where support should be provided for the formulation and efficient execution of business process model queries. As queries based on only structural information cannot deal with all querying requirements in practice, there should be support for queries that require knowledge of process model semantics. In this paper we formally define a process model query language that is based on semantic relationships between tasks. This query language is independent of the particular process modelling notation used, but we will demonstrate how it can be used in the context of Petri nets by showing how the semantic relationships can be determined for these nets in such a way that state space explosion is avoided as much as possible. An experiment with three large process model repositories shows that queries expressed in our language can be evaluated efficiently.
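
The abstract does not spell out the query syntax, so the sketch below invents a minimal one to convey the idea: each repository model is summarised by precomputed semantic relations between task pairs (for example, derived from its Petri-net state space or an over-approximation of it), and a query is a conjunction of required relations evaluated against those summaries. Relation names, tasks and models are all hypothetical.

```python
# Illustrative only: a tiny, invented query mechanism over precomputed
# semantic relationships between tasks in a process model repository.

REPOSITORY = {
    "order_handling": {("receive order", "ship goods"): "eventually_follows",
                       ("reject order", "ship goods"): "exclusive"},
    "claims":         {("register claim", "pay claim"): "eventually_follows",
                       ("register claim", "reject claim"): "eventually_follows"},
}

def matches(model_relations, query):
    """A model matches if every (task pair -> relation) in the query holds."""
    return all(model_relations.get(pair) == rel for pair, rel in query.items())

def run_query(repository, query):
    """Return the names of all models satisfying the query."""
    return [name for name, rels in repository.items() if matches(rels, query)]

if __name__ == "__main__":
    # "Find models where shipping can eventually follow order receipt,
    #  but shipping and rejection are mutually exclusive."
    query = {("receive order", "ship goods"): "eventually_follows",
             ("reject order", "ship goods"): "exclusive"}
    print(run_query(REPOSITORY, query))   # ['order_handling']
```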