64 results for Coarse-to-fine processing

in Queensland University of Technology - ePrints Archive


Relevance:

100.00%

Publisher:

Abstract:

In competitive combat sporting environments like boxing, the statistics on a boxer's performance, including the number and type of punches thrown, provide a valuable source of data and feedback which is routinely used for coaching and performance improvement purposes. This paper presents a robust framework for the automatic classification of a boxer's punches. Overhead depth imagery is employed to alleviate challenges associated with occlusions, and robust body-part tracking is developed for the noisy time-of-flight sensors. Punch recognition is addressed through both multi-class SVM and Random Forest classifiers. A coarse-to-fine hierarchical SVM classifier is presented based on prior knowledge of boxing punches. This framework has been applied to shadow boxing image sequences taken at the Australian Institute of Sport with 8 elite boxers. Results demonstrate the effectiveness of the proposed approach, with the hierarchical SVM classifier yielding 96% accuracy, signifying its suitability for analysing athletes' punches in boxing bouts.
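The coarse-to-fine idea above (first assign a punch to a broad category, then refine within it) can be sketched as follows. The paper uses hierarchical SVMs; here simple nearest-centroid classifiers stand in at each level, and the punch taxonomy and two-dimensional features are hypothetical illustrations, not the paper's actual setup.

```python
# Coarse-to-fine punch classification sketch. Nearest-centroid
# classifiers stand in for the paper's hierarchical SVMs; labels and
# features are hypothetical.

def centroid(rows):
    """Mean of a list of equal-length feature vectors."""
    n = len(rows)
    return [sum(c) / n for c in zip(*rows)]

def nearest(x, centroids):
    """Label whose centroid is closest to x (squared Euclidean)."""
    return min(centroids,
               key=lambda k: sum((a - b) ** 2 for a, b in zip(x, centroids[k])))

class CoarseToFine:
    def __init__(self, coarse_data, fine_data):
        # coarse_data: {coarse_label: [feature vectors]}
        # fine_data: {coarse_label: {fine_label: [feature vectors]}}
        self.coarse = {k: centroid(v) for k, v in coarse_data.items()}
        self.fine = {k: {f: centroid(v) for f, v in d.items()}
                     for k, d in fine_data.items()}

    def classify(self, x):
        c = nearest(x, self.coarse)   # stage 1: coarse punch category
        f = nearest(x, self.fine[c])  # stage 2: refine within that category
        return c, f

# Hypothetical training data: [hand_height, lateral_extension]
straights = {"jab": [[0.9, 0.1]], "cross": [[0.8, 0.2]]}
hooks = {"lead_hook": [[0.5, 0.9]], "rear_hook": [[0.4, 0.8]]}
clf = CoarseToFine(
    coarse_data={"straight": [[0.9, 0.1], [0.8, 0.2]],
                 "hook": [[0.5, 0.9], [0.4, 0.8]]},
    fine_data={"straight": straights, "hook": hooks},
)
print(clf.classify([0.88, 0.12]))  # ('straight', 'jab')
```

The benefit of the hierarchy is that each stage only discriminates among a few confusable classes, which is where prior knowledge of punch types (straights vs. hooks vs. uppercuts) can be encoded.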

Background: When observers are asked to identify two targets in rapid sequence, they often suffer profound performance deficits for the second target, even when the spatial location of the targets is known. This attentional blink (AB) is usually attributed to the time required to process a previous target, implying that a link should exist between individual differences in information-processing speed and the AB.

Methodology/Principal Findings: The present work investigated this question by examining the relationship between a rapid automatized naming task typically used to assess information-processing speed and the magnitude of the AB. The results indicated that faster processing actually resulted in a greater AB, but only when targets were presented amongst high-similarity distractors. When target-distractor similarity was minimal, processing speed was unrelated to the AB.

Conclusions/Significance: Our findings indicate that information-processing speed is related not to target-processing efficiency per se, but rather to individual differences in observers' ability to suppress distractors. This is consistent with evidence that individuals who are able to avoid distraction are more efficient at deploying temporal attention, but argues against a direct link between general processing speed and efficient information selection.

Exposures to traffic-related air pollution (TRAP) can be particularly high in transport microenvironments (i.e. in and around vehicles) despite the short durations typically spent there. There is a mounting body of evidence that suggests that this is especially true for fine (<2.5 μm) and ultrafine (<100 nm, UF) particles. Professional drivers, who spend extended periods of time in transport microenvironments due to their job, may incur exposures markedly higher than already elevated non-occupational exposures. Numerous epidemiological studies have shown a raised incidence of adverse health outcomes among professional drivers, and exposure to TRAP has been suggested as one of the possible causal factors. Despite this, data describing the range and determinants of occupational exposures to fine and UF particles are largely conspicuous by their absence. Such information could strengthen attempts to define the aetiology of professional drivers' illnesses as it relates to traffic combustion-derived particles. In this article, we suggest that drivers' occupational fine and UF particle exposures are an exemplar case where opportunities exist to better link exposure science and epidemiology in addressing questions of causality. The nature of the hazard is first introduced, followed by an overview of the health effects attributable to exposures typical of transport microenvironments. Basic determinants of exposure and reduction strategies are also described, and finally the state of knowledge is briefly summarised along with an outline of the main unanswered questions in the topic area.

This paper presents an experimental study to evaluate the effect of coarse and fine LWA in concrete on its water absorption and permeability, and resistance to chloride-ion penetration. In addition, LWC with a lower unit weight of about 1300 kg/m3 but high resistance to water and chloride-ion penetration was developed and evaluated. The results indicate that the incorporation of coarse LWA in concrete increases water sorptivity and permeability slightly compared to NWC of similar w/c. The resistance of the sand-LWC to chloride-ion penetration depends on the porosity of the coarse LWA. Fine LWA has more influence on the transport properties of concrete than coarse LWA. Use of lightweight crushed sand <1.18 mm reduced the resistance of the LWC to water and chloride-ion penetration to some extent. With low w/cm and silica fume, low unit weight LWC (~1300 kg/m3) was produced with higher resistance to water and chloride-ion penetration compared with concretes of higher unit weights.

Purpose: The role of fine lactose in the dispersion of salmeterol xinafoate (SX) from lactose mixtures was studied by modifying the fine lactose concentration on the surface of the lactose carriers using wet decantation.

Methods: Fine lactose was removed from lactose carriers by wet decantation using ethanol saturated with lactose. Particle sizing was achieved by laser diffraction. Fine particle fractions (FPFs) were determined by Twin Stage Impinger using a 2.5% SX mixture, and SX was analyzed by a validated high-performance liquid chromatography method. Adhesion forces between probes of SX and silica and the lactose surfaces were determined by atomic force microscopy.

Results: FPFs of SX were related to fine lactose concentration in the mixture for inhalation-grade lactose samples. Reductions in FPF (2-4-fold) of Aeroflo 95 and 65 were observed after removing fine lactose by wet decantation; FPFs reverted to original values after addition of micronized lactose to decanted mixtures. FPFs of SX of sieved and decanted fractions of Aeroflo carriers were significantly different (p < 0.001). The relationship between FPF and fine lactose concentration was linear. Decanted lactose demonstrated surface modification through increased SX-lactose adhesion forces; however, any surface modification other than removal of fine lactose only slightly influenced FPF.

Conclusions: Fine lactose played a key and dominating role in controlling FPF. SX to fine lactose ratios influenced dispersion of SX, with maximum dispersion occurring as the ratio approached unity.

SoundCipher is a software library written in the Java language that adds important music and sound features to the Processing environment, which is widely used by media artists but otherwise has an orientation toward computational graphics. This article introduces the SoundCipher library and its features, describes its influences and design intentions, and positions it within the field of computer music programming tools. SoundCipher makes the rich history of algorithmic music techniques accessible within one of today's most popular media art platforms. It also provides an accessible means for learning to create algorithmic music and sound programs.

While spatial determinants of emmetropization have been examined extensively in animal models and spatial processing of human myopes has also been studied, there have been few studies investigating temporal aspects of emmetropization and temporal processing in human myopia. The influence of temporal light modulation on eye growth and refractive compensation has been observed in animal models and there is evidence of temporal visual processing deficits in individuals with high myopia or other pathologies. Given this, the aims of this work were to examine the relationships between myopia (i.e. degree of myopia and progression status) and temporal visual performance and to consider any temporal processing deficits in terms of the parallel retinocortical pathways. Three psychophysical studies investigating temporal processing performance were conducted in young adult myopes and non-myopes: (1) backward visual masking, (2) dot motion perception and (3) phantom contour. For each experiment there were approximately 30 young emmetropes, 30 low myopes (myopia less than 5 D) and 30 high myopes (5 to 12 D). In the backward visual masking experiment, myopes were also classified according to their progression status (30 stable myopes and 30 progressing myopes). The first study was based on the observation that the visibility of a target is reduced by a second target, termed the mask, presented quickly after the first target. Myopes were more affected by the mask when the task was biased towards the magnocellular pathway; myopes had a 25% mean reduction in performance compared with emmetropes. However, there was no difference in the effect of the mask when the task was biased towards the parvocellular system. For all test conditions, there was no significant correlation between backward visual masking task performance and either the degree of myopia or myopia progression status. 
The dot motion perception study measured detection thresholds for the minimum displacement of moving dots, the maximum displacement of moving dots and the degree of motion coherence required to correctly determine the direction of motion. The visual processing of these tasks is dominated by the magnocellular pathway. Compared with emmetropes, high myopes had reduced ability to detect the minimum displacement of moving dots for stimuli presented at the fovea (20% higher mean threshold) and possibly at the inferior nasal retina. The minimum displacement threshold was significantly and positively correlated with myopia magnitude and axial length, and significantly and negatively correlated with retinal thickness for the inferior nasal retina. The performance of emmetropes and myopes on all the other dot motion perception tasks was similar. In the phantom contour study, the highest temporal frequency of the flickering phantom pattern at which the contour was visible was determined. Myopes had significantly lower flicker detection limits (21.8 ± 7.1 Hz) than emmetropes (25.6 ± 8.8 Hz) for tasks biased towards the magnocellular pathway for both high (99%) and low (5%) contrast stimuli. There was no difference in flicker limits for a phantom contour task biased towards the parvocellular pathway. For all phantom contour tasks, there was no significant correlation between flicker detection thresholds and magnitude of myopia. Of the psychophysical temporal tasks studied here, those primarily involving processing by the magnocellular pathway revealed differences in the performance of the refractive error groups. While there are a number of interpretations of these data, this suggests that there may be a temporal processing deficit in some myopes that is selective for the magnocellular system. The minimum displacement dot motion perception task appears the most sensitive test, of those studied, for investigating changes in visual temporal processing in myopia. 
Data from the visual masking and phantom contour tasks suggest that the alterations to temporal processing occur at an early stage of myopia development. In addition, the link between increased minimum displacement threshold and decreasing retinal thickness suggests that there is a retinal component to the observed modifications in temporal processing.

Multiresolution techniques are being extensively used in the signal processing literature. This paper has two parts. In the first part, we derive a relationship between the general degradation model (Y = BX + W) at coarse and fine resolutions. In the second part, we develop a signal restoration scheme in a multiresolution framework and demonstrate through experiments that knowledge of the relationship between the degradation models at different resolutions helps in obtaining a computationally efficient restoration scheme.
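One simple instance of such a relationship follows directly from linearity: if Y = BX + W holds at the fine resolution and D is a linear downsampling operator, then the coarse observation DY obeys DY = (DB)X + DW, i.e. a degradation model with composed blur DB and downsampled noise DW. The sketch below verifies this identity numerically; the matrix sizes and pair-averaging operator D are illustrative choices, not the paper's derivation.

```python
# Carrying the degradation model Y = BX + W to a coarser resolution:
# with linear downsampling D, the coarse observation DY satisfies
# DY = (DB)X + DW by linearity. Sizes and D are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n = 8
x = rng.standard_normal(n)           # fine-resolution signal X
B = rng.standard_normal((n, n))      # degradation (e.g. blur) operator B
w = 0.01 * rng.standard_normal(n)    # additive noise W
y = B @ x + w                        # fine-resolution observation Y

# D: average adjacent pairs -> half-resolution (n/2 x n) operator
D = np.zeros((n // 2, n))
for i in range(n // 2):
    D[i, 2 * i] = D[i, 2 * i + 1] = 0.5

y_coarse = D @ y                     # coarse observation DY
model_coarse = (D @ B) @ x + D @ w   # coarse degradation model (DB)X + DW
print("coarse model consistent:", np.allclose(y_coarse, model_coarse))
```

The computational gain the abstract alludes to comes from restoring first at the coarse resolution (smaller systems) and using the result to initialise or constrain restoration at the fine resolution.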

Previous research on the protection of soil organic C from decomposition suggests that soil texture affects soil C stocks. However, different pools of soil organic matter (SOM) might be differently related to soil texture. Our objective was to examine how soil texture differentially alters the distribution of organic C within physically and chemically defined pools of unprotected and protected SOM. We collected samples from two soil texture gradients where other variables influencing soil organic C content were held constant. One texture gradient (16-60% clay) was located near Stewart Valley, Saskatchewan, Canada and the other (25-50% clay) near Cygnet, OH. Soils were physically fractionated into coarse- and fine-particulate organic matter (POM), silt- and clay-sized particles within microaggregates, and easily dispersed silt-and clay-sized particles outside of microaggregates. Whole-soil organic C concentration was positively related to silt plus clay content at both sites. We found no relationship between soil texture and unprotected C (coarse- and fine-POM C). Biochemically protected C (nonhydrolyzable C) increased with increasing clay content in whole-soil samples, but the proportion of nonhydrolyzable C within silt- and clay-sized fractions was unchanged. As the amount of silt or clay increased, the amount of C stabilized within easily dispersed and microaggregate-associated silt or clay fractions decreased. Our results suggest that for a given level of C inputs, the relationship between mineral surface area and soil organic matter varies with soil texture for physically and biochemically protected C fractions. Because soil texture acts directly and indirectly on various protection mechanisms, it may not be a universal predictor of whole-soil C content.

Automobiles have deeply impacted the way in which we travel, but they have also contributed to many deaths and injuries due to crashes. A number of reasons for these crashes have been pointed out by researchers. Inexperience has been identified as a contributing factor to road crashes. A driver's driving abilities also play a vital role in judging the road environment and reacting in time to avoid any possible collision. Therefore a driver's perceptual and motor skills remain the key factors impacting road safety. Our failure to understand what is really important for learners, in terms of competent driving, is one of the many challenges for building better training programs. Driver training is one of the interventions aimed at decreasing the number of crashes that involve young drivers. Currently, there is a need to develop a comprehensive driver evaluation system that benefits from advances in Driver Assistance Systems. A multidisciplinary approach is necessary to explain how driving abilities evolve with on-road driving experience. To our knowledge, driver assistance systems have never been comprehensively used in a driver training context to assess the safety aspect of driving. The aim and novelty of this thesis is to develop and evaluate an Intelligent Driver Training System (IDTS) as an automated assessment tool that will help drivers and their trainers to comprehensively view complex driving manoeuvres and potentially provide effective feedback by post-processing the data recorded during driving. This system is designed to help driver trainers to accurately evaluate driver performance and has the potential to provide valuable feedback to the drivers. Since driving is dependent on fuzzy inputs from the driver (i.e. 
approximate distance calculation from the other vehicles, approximate estimation of the other vehicle's speed), it is necessary that the evaluation system is based on criteria and rules that handle the uncertain and fuzzy characteristics of the driving tasks. Therefore, the proposed IDTS utilizes fuzzy set theory for the assessment of driver performance. The proposed research program focuses on integrating the multi-sensory information acquired from the vehicle, driver and environment to assess driving competencies. After information acquisition, the current research focuses on automated segmentation of the selected manoeuvres from the driving scenario. This leads to the creation of a model that determines a “competency” criterion through the driving performance protocol used by driver trainers (i.e. expert knowledge) to assess drivers. This is achieved by comprehensively evaluating and assessing the data stream acquired from multiple in-vehicle sensors using fuzzy rules and classifying the driving manoeuvres (i.e. overtake, lane change, T-crossing and turn) between low and high competency. The fuzzy rules use parameters such as following distance, gaze depth and scan area, distance with respect to lanes and excessive acceleration or braking during the manoeuvres to assess competency. These rules that identify driving competency were initially designed with the help of experts' knowledge (i.e. driver trainers). In order to fine-tune these rules and the parameters that define them, a driving experiment was conducted to identify the empirical differences between novice and experienced drivers. The results from the driving experiment indicated that significant differences existed between novice and experienced drivers, in terms of their gaze pattern and duration, speed, stop time at the T-crossing, lane keeping and the time spent in lanes while performing the selected manoeuvres. 
These differences were used to refine the fuzzy membership functions and rules that govern the assessments of the driving tasks. Next, this research focused on providing an integrated visual assessment interface to both driver trainers and their trainees. By providing a rich set of interactive graphical interfaces displaying information about the driving tasks, the Intelligent Driver Training System (IDTS) visualisation module has the potential to give empirical feedback to its users. Lastly, the validation of the IDTS system's assessment was conducted by comparing IDTS objective assessments, for the driving experiment, with the subjective assessments of the driver trainers for particular manoeuvres. Results show that IDTS not only matched the subjective assessments made by driver trainers during the driving experiment but also identified some additional driving manoeuvres performed at low competency that were not identified by the driver trainers, owing to the trainers' increased mental workload when assessing the multiple variables that constitute driving. The validation of IDTS emphasized the need for an automated assessment tool that can segment the manoeuvres from the driving scenario, further investigate the variables within each manoeuvre to determine its competency and provide integrated visualisation regarding the manoeuvre to its users (i.e. trainers and trainees). Through analysis and validation it was shown that IDTS is a useful assistance tool for driver trainers to empirically assess and potentially provide feedback regarding the manoeuvres undertaken by the drivers.
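The fuzzy-rule approach described above can be sketched briefly: triangular membership functions grade raw sensor readings, and a min (AND) operator combines them into a competency score. All breakpoints, parameter names and the single rule below are hypothetical illustrations, not the thesis's calibrated values.

```python
# Fuzzy competency assessment sketch in the spirit of IDTS.
# Membership breakpoints and the rule are hypothetical.

def tri(x, a, b, c):
    """Triangular membership: rises over [a, b], falls over [b, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def safe_following(dist_m):
    # degree to which the following distance (metres) is "safe"
    return tri(dist_m, 10.0, 30.0, 60.0)

def smooth_braking(decel):
    # degree to which deceleration (m/s^2) is "smooth" (not excessive)
    return tri(decel, -1.0, 0.0, 3.0)

def competency(dist_m, decel):
    # one fuzzy rule: competent IF following is safe AND braking is smooth;
    # AND realised as min over the antecedent memberships
    return min(safe_following(dist_m), smooth_braking(decel))

# A manoeuvre with a 25 m gap and gentle 1 m/s^2 braking scores higher
# than tailgating at 12 m with hard 2.8 m/s^2 braking.
good = competency(25.0, 1.0)
poor = competency(12.0, 2.8)
print(good, poor)
```

In a full system each manoeuvre type would aggregate many such rules (gaze, lane position, speed) and a threshold on the aggregated score would separate low from high competency.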

The present study used ERPs to compare processing of fear-relevant (FR) animals (snakes and spiders) and non-fear-relevant (NFR) animals similar in appearance (worms and beetles). EEG was recorded from 18 undergraduate participants (10 females) as they completed two animal-viewing tasks that required simple categorization decisions. Participants were divided on a post hoc basis into low snake/spider fear and high snake/spider fear groups. Overall, FR animals were rated higher on fear and elicited a larger LPC. However, individual differences qualified these effects. Participants in the low fear group showed clear differentiation between FR and NFR animals on subjective ratings of fear and LPC modulation. In contrast, participants in the high fear group did not show such differentiation between FR and NFR animals. These findings suggest that the salience of feared FR animals may generalize on both a behavioural and electro-cortical level to other animals of similar appearance but of a non-harmful nature.

Multiple sclerosis (MS) is a common chronic inflammatory disease of the central nervous system. Susceptibility to the disease is affected by both environmental and genetic factors. Genetic factors include haplotypes in the major histocompatibility complex (MHC) and over 50 non-MHC loci reported by genome-wide association studies. Amongst these, we previously reported polymorphisms in chromosome 12q13-14 with a protective effect in individuals of European descent. This locus spans 288 kb and contains 17 genes, including several candidate genes which have potentially significant pathogenic and therapeutic implications. In this study, we aimed to fine-map this locus. We have implemented a two-phase study: a variant discovery phase where we have used next-generation sequencing and two target-enrichment strategies [long-range polymerase chain reaction (PCR) and Nimblegen's solution-phase hybridization capture] in pools of 25 samples; and a genotyping phase where we genotyped 712 variants in 3577 healthy controls and 3269 MS patients. This study confirmed the association (rs2069502, P = 9.9 × 10⁻¹¹, OR = 0.787) and narrowed down the locus of association to an 86.5 kb region. Although the study was unable to pinpoint the key associated variant, we have identified a haplotype block of 42 (genotyped and imputed) single-nucleotide polymorphisms likely to harbour the causal variant. No evidence of association at previously reported low-frequency variants in CYP27B1 was observed. As part of the study we compared variant discovery performance using the two target-enrichment strategies. We concluded that our pools enriched with Nimblegen's solution-phase hybridization capture had better sensitivity to detect true variants than the pools enriched with long-range PCR, whilst specificity was better in the long-range PCR-enriched pools than in the solution-phase hybridization capture enriched pools; this result has important implications for the design of future fine-mapping studies.

The growth of APIs and Web services on the Internet, especially through larger enterprise systems increasingly being leveraged for Cloud and software-as-a-service opportunities, poses challenges for improving the efficiency of integration with these services. Interfaces of enterprise systems are typically larger, more complex and overloaded, with single operations having multiple data entities and parameter sets, supporting varying requests, and reflecting versioning across different system releases, compared to the fine-grained operations of contemporary interfaces. We propose a technique to support the refactoring of service interfaces by deriving business entities and their relationships. In this paper, we focus on the behavioural aspects of service interfaces, aiming to discover the sequential dependencies of operations (otherwise known as protocol extraction) based on the entities and relationships derived. Specifically, we propose heuristics according to these relationships, and in turn derive permissible orders in which operations are invoked. As a result, service operations can be refactored on business entity CRUD lines, with explicit behavioural protocols as part of an interface definition. This supports flexible service discovery, composition and integration. A prototypical implementation and analysis of existing Web services, including those of commercial logistics systems (FedEx), are used to validate the algorithms proposed throughout the paper.
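The kind of ordering heuristic described above can be sketched concretely: once operations are classified on CRUD lines over derived business entities, a sequence of calls is permissible only if each entity is created before it is read, updated or deleted, and an entity's referenced (parent) entities exist before it is created. The entity names and the two specific precedence rules below are illustrative assumptions, not the paper's algorithm.

```python
# Protocol-extraction heuristic sketch: validate a sequence of
# (CRUD verb, entity) calls against entity-relationship precedence.
# Rules and entities are illustrative assumptions.

def permissible(sequence, references):
    """Two heuristics:
    1. per entity: create before read/update, nothing after delete;
    2. if entity a references b, b must be created before a."""
    created, deleted = set(), set()
    for verb, entity in sequence:
        if entity in deleted:
            return False  # no operations on a deleted entity
        if verb == "create":
            # referenced (parent) entities must already exist
            if any(p not in created for p in references.get(entity, [])):
                return False
            created.add(entity)
        elif verb in ("read", "update"):
            if entity not in created:
                return False
        elif verb == "delete":
            if entity not in created:
                return False
            deleted.add(entity)
    return True

# Hypothetical logistics-style entities: a Shipment references a Customer.
refs = {"Shipment": ["Customer"]}
ok = [("create", "Customer"), ("create", "Shipment"),
      ("read", "Shipment"), ("delete", "Shipment")]
bad = [("create", "Shipment"), ("create", "Customer")]  # parent missing
print(permissible(ok, refs), permissible(bad, refs))  # True False
```

The set of all permissible orders generated this way is what becomes the explicit behavioural protocol attached to the refactored interface definition.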

Objective: To illustrate a new method for simplifying patient recruitment for advanced prostate cancer clinical trials using natural language processing techniques. Background: The identification of eligible participants for clinical trials is a critical factor in increasing patient recruitment rates and an important issue for the discovery of new treatment interventions. The current practice of identifying eligible participants is highly constrained due to manual processing of disparate sources of unstructured patient data. Informatics-based approaches can simplify the complex task of evaluating a patient's eligibility for clinical trials. We show that an ontology-based approach can address the challenge of matching patients to suitable clinical trials. Methods: The free-text descriptions of clinical trial criteria as well as patient data were analysed. A set of common inclusion and exclusion criteria was identified through consultations with expert clinical trial coordinators. A research prototype was developed using the Unstructured Information Management Architecture (UIMA) that identified SNOMED CT concepts in the patient data and clinical trial descriptions. The SNOMED CT concepts model the standard clinical terminology that can be used to represent and evaluate a patient's inclusion/exclusion criteria for the clinical trial. Results: Our experimental research prototype describes a semi-automated method for filtering patient records using common clinical trial criteria. Our method simplified the patient recruitment process. Discussions with clinical trial coordinators showed that the efficiency of the patient recruitment process, measured in terms of information-processing time, could be improved by 25%. Conclusion: A UIMA-based approach can resolve complexities in patient recruitment for advanced prostate cancer clinical trials.
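The matching step this abstract describes can be reduced to its core: concepts identified in free text (by UIMA annotators against SNOMED CT in the actual prototype) turn both patient records and trial criteria into sets of concept codes, which are then compared. The sketch below uses a toy phrase lexicon and made-up codes; it is an illustration of the set-comparison idea, not SNOMED CT content or the UIMA API.

```python
# Concept-based eligibility check sketch. The lexicon, codes and
# substring "annotator" are simplified stand-ins for UIMA + SNOMED CT.

# Hypothetical lexicon mapping phrases to made-up concept codes.
LEXICON = {
    "prostate cancer": "C-PROSTATE-CA",
    "metastatic": "C-METASTATIC",
    "renal failure": "C-RENAL-FAIL",
}

def annotate(text):
    """Return the set of lexicon concept codes mentioned in the text."""
    text = text.lower()
    return {code for phrase, code in LEXICON.items() if phrase in text}

def eligible(patient_text, inclusion, exclusion):
    """Patient qualifies if every inclusion concept is present in the
    record and no exclusion concept is."""
    found = annotate(patient_text)
    return inclusion <= found and not (exclusion & found)

record = "Patient with metastatic prostate cancer; renal function normal."
print(eligible(record,
               inclusion={"C-PROSTATE-CA", "C-METASTATIC"},
               exclusion={"C-RENAL-FAIL"}))  # True
```

A real annotator must also handle negation ("no history of renal failure") and synonymy, which is why the prototype is described as semi-automated, with coordinators reviewing the filtered records.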