31 results for event detection algorithm

in BORIS: Bern Open Repository and Information System - Bern - Switzerland


Relevance:

100.00%

Publisher:

Abstract:

Any image-processing object detection algorithm in some way integrates the object light (Recognition Step) and applies statistical criteria to distinguish objects of interest from other objects or from pure background (Decision Step). There are various ways in which these two basic steps can be realized, as can be seen in the different detection methods proposed in the literature. An ideal detection algorithm should provide high recognition sensitivity with high decision accuracy and require only reasonable computational effort. In reality, a gain in sensitivity is usually possible only at the cost of decision accuracy and higher computational effort, so automatic detection of faint streaks remains a challenge. This paper presents a detection algorithm using spatial filters that simulate the geometrical form of possible streaks on a CCD image. This is realized by image convolution. The goal of this method is to generate a more or less perfect match between a streak and a filter by varying the length and orientation of the filters. The convolution responses are accepted or rejected according to an overall threshold given by the background statistics. As a first result, this approach yields a huge number of accepted responses due to filters partially covering streaks or remaining stars. To avoid this, a set of additional acceptance criteria has been included in the detection method. All criteria parameters are justified by background and streak statistics, and they affect the detection sensitivity only marginally. Tests on images containing simulated streaks and on real images containing satellite streaks show very promising sensitivity, reliability, and running speed for this detection method. Since all method parameters are based on statistics, both the true-alarm and the false-alarm probabilities are well controllable. Moreover, the proposed method places no extraordinary demands on the computer hardware or on the image acquisition process.
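As an illustration of the matched-filter idea, the following Python sketch convolves an image with line-shaped kernels of varying length and orientation and keeps pixels whose best response exceeds a background-derived threshold. This is a minimal sketch of the general technique, not the authors' implementation; the kernel construction, the median/standard-deviation background estimate, and the n-sigma threshold are assumptions.

```python
import numpy as np
from scipy.ndimage import convolve

def line_kernel(length, angle_deg, size=31):
    """Line-shaped kernel approximating a streak of a given length and
    orientation, normalized to unit sum (illustrative construction)."""
    k = np.zeros((size, size))
    c = size // 2
    t = np.linspace(-length / 2, length / 2, 4 * length)
    a = np.deg2rad(angle_deg)
    rows = np.clip(np.round(c + t * np.sin(a)).astype(int), 0, size - 1)
    cols = np.clip(np.round(c + t * np.cos(a)).astype(int), 0, size - 1)
    k[rows, cols] = 1.0
    return k / k.sum()

def detect_streaks(image, lengths=(9, 15, 21), angles=range(0, 180, 10),
                   n_sigma=5.0):
    """Boolean map of pixels whose best oriented-filter response exceeds
    the background level by n_sigma background standard deviations."""
    mu, sigma = np.median(image), image.std()  # crude background statistics
    best = np.full(image.shape, -np.inf)
    for length in lengths:
        for angle in angles:
            response = convolve(image.astype(float), line_kernel(length, angle))
            best = np.maximum(best, response)
    return best > mu + n_sigma * sigma
```

In the same spirit as the paper, additional acceptance criteria (e.g., on the response profile along the filter direction) would then prune responses caused by filters only partially covering streaks or stars.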

Relevance:

100.00%

Publisher:

Abstract:

Introduction: Although it seems plausible that sports performance relies on high-acuity foveal vision, it has been shown empirically that myopic blur (up to +2 diopters) does not harm performance in sport tasks that require foveal information pick-up, such as golf putting (Bulson, Ciuffreda, & Hung, 2008). How myopic blur affects peripheral performance is as yet unknown. With reduced foveal vision, less attention might be needed for processing visual cues foveally, so that peripheral cues are processed better and performance improves; this prediction was tested in the current experiment. Methods: 18 sport science students with self-reported myopia, all of them regularly wearing contact lenses, volunteered as participants. Exclusion criteria comprised visual correction other than myopic, correction of astigmatism, and use of contact lenses from outside the Swiss delivery area. For each participant, three pairs of additional contact lenses (besides their regular lenses, used in the "plano" condition) were manufactured with an individual overcorrection to a retinal defocus of +1 to +3 diopters (referred to as the "+1.00 D", "+2.00 D", and "+3.00 D" conditions, respectively). Gaze data were acquired while participants performed a multiple object tracking (MOT) task that required them to track 4 out of 10 moving stimuli. In addition, in 66.7% of all trials, one of the 4 targets suddenly stopped during the motion phase for a period of 0.5 s. Stimuli moved in front of a picture of a sports hall to allow for foveal processing. Due to the directional hypotheses, the level of significance for one-tailed tests on differences was set at α = .05, and a posteriori effect sizes were computed as partial eta squared (ηp²). Results: Due to problems with the gaze-data collection, 3 participants had to be excluded from further analyses. The expectation of a centroid strategy was confirmed, because gaze was closer to the centroid than to the targets (all p < .01). In comparison to the plano baseline, participants more often recalled all 4 targets under defocus conditions, F(1,14) = 26.13, p < .01, ηp² = .65. The three defocus conditions differed significantly, F(2,28) = 2.56, p = .05, ηp² = .16, with higher accuracy as a function of increasing defocus and significant contrasts between conditions +1.00 D and +2.00 D (p = .03) and between +1.00 D and +3.00 D (p = .03). For stop trials, significant differences were found neither between the plano baseline and the defocus conditions, F(1,14) = .19, p = .67, ηp² = .01, nor between the three defocus conditions, F(2,28) = 1.09, p = .18, ηp² = .07. Participants reacted faster in "4 correct+button" trials under defocus than under plano-baseline conditions, F(1,14) = 10.77, p < .01, ηp² = .44. The defocus conditions differed significantly, F(2,28) = 6.16, p < .01, ηp² = .31, with shorter response times as a function of increasing defocus and significant contrasts between +1.00 D and +2.00 D (p = .01) and between +1.00 D and +3.00 D (p < .01). Discussion: The results show that gaze behaviour in MOT is not affected to a relevant degree by a visual overcorrection of up to +3 diopters; hence, it can be taken for granted that peripheral event detection was investigated in the present study. The overcorrection does not harm the capability to track objects peripherally, and if an event has to be detected peripherally, neither response accuracy nor response time is negatively affected. The findings could claim considerable relevance for all sport situations in which peripheral vision is required, which now calls for applied studies on this topic. References: Bulson, R. C., Ciuffreda, K. J., & Hung, G. K. (2008). The effect of retinal defocus on golf putting. Ophthalmic and Physiological Optics, 28, 334-344.
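The centroid-strategy check reported in the results can be illustrated with a short sketch that compares, frame by frame, the gaze point's distance to the centroid of the four targets with its distance to the nearest target. Array shapes and names are assumptions for illustration; the study's actual gaze-analysis pipeline is not described in the abstract.

```python
import numpy as np

def centroid_strategy_scores(gaze, targets):
    """gaze: (n_frames, 2) gaze positions; targets: (n_frames, 4, 2) target
    positions. Returns mean gaze-to-centroid and mean gaze-to-nearest-target
    distance; a centroid strategy predicts the former to be smaller."""
    centroid = targets.mean(axis=1)                       # (n_frames, 2)
    d_centroid = np.linalg.norm(gaze - centroid, axis=1)
    d_targets = np.linalg.norm(targets - gaze[:, None, :], axis=2)
    return d_centroid.mean(), d_targets.min(axis=1).mean()
```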

Relevance:

90.00%

Publisher:

Abstract:

We investigate the problem of distributed detection of sensor failures in networks with a small number of defective sensors, whose measurements differ significantly from those of their neighbors. We build on the sparse nature of the binary sensor-failure signals to propose a novel distributed detection algorithm based on gossip mechanisms and on Group Testing (GT), where the latter has so far been used in centralized detection problems. The new distributed GT algorithm estimates the set of scattered defective sensors with a low-complexity distance decoder from a small number of linearly independent binary messages exchanged by the sensors. We first consider networks with one defective sensor and determine the minimal number of linearly independent messages needed for its detection with high probability. We then extend our study to the detection of multiple defective sensors by appropriately modifying the message exchange protocol and the decoding procedure. We show that, for small and medium-sized networks, the number of messages required for successful detection is actually smaller than the minimal number computed theoretically. Finally, simulations demonstrate that the proposed method outperforms methods based on random walks in terms of both detection performance and convergence rate.
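For the single-defective-sensor case, the distance-decoding idea can be sketched as follows: each binary message behaves like a group test (the OR of the failure indicators of the sensors that contributed to it), and the decoder selects the sensor whose participation pattern is closest in Hamming distance to the observed outcomes. The random test matrix and its parameters are illustrative assumptions, not the paper's gossip-based message exchange.

```python
import numpy as np

rng = np.random.default_rng(0)
n_sensors, n_messages = 50, 12

# Test matrix: A[m, s] = 1 if sensor s contributes to message m.
A = (rng.random((n_messages, n_sensors)) < 0.3).astype(int)

x = np.zeros(n_sensors, dtype=int)
x[17] = 1                                  # one defective sensor (unknown)
outcomes = (A @ x > 0).astype(int)         # each message: OR of tested indicators

# Distance decoding: the defective sensor's column should match the outcomes.
hamming = np.abs(A - outcomes[:, None]).sum(axis=0)
print("estimated defective sensor:", int(np.argmin(hamming)))
```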

Relevance:

80.00%

Publisher:

Abstract:

PURPOSE: To facilitate the minimally invasive, robotically performed direct cochlear access (DCA) procedure, a surgical planning tool is required that enables the surgeon to define landmarks for patient-to-image registration, identify the necessary anatomical structures, and define a safe DCA trajectory using patient image data (typically computed tomography (CT) or cone beam CT). To this end, a dedicated end-to-end software planning system for the planning of DCA procedures that addresses current deficiencies has been developed. METHODS: Efficient and robust anatomical segmentation is achieved through the implementation of semiautomatic algorithms; high-accuracy patient-to-image registration is achieved via an automated model-based fiducial detection algorithm; and functionality was developed for the interactive definition of a safe drilling trajectory based on case-specific drill positioning uncertainty calculations. RESULTS: The accuracy and safety of the presented software tool were validated during eight DCA procedures performed on cadaver heads. The plan for each ear was completed in less than 20 min, and no damage to vital structures occurred during the procedures. The integrated fiducial detection functionality enabled final positioning accuracies of [Formula: see text] mm. CONCLUSIONS: Results of this study demonstrated that the proposed software system can aid in the safe planning of a DCA tunnel within an acceptable time.
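The trajectory-safety idea, keeping the drill axis farther from each vital structure than the case-specific positioning uncertainty allows, can be sketched geometrically. The function names, the margin model (uncertainty plus drill radius), and the numbers below are illustrative assumptions, not the planning system's actual computation.

```python
import numpy as np

def point_to_segment_distance(p, a, b):
    """Minimal Euclidean distance from point p to the segment a-b."""
    ab, ap = b - a, p - a
    t = np.clip(np.dot(ap, ab) / np.dot(ab, ab), 0.0, 1.0)
    return np.linalg.norm(p - (a + t * ab))

def trajectory_is_safe(entry, target, structures, uncertainty_mm, drill_radius_mm):
    """Accept a trajectory only if every critical-structure point keeps a
    margin of positioning uncertainty plus drill radius from the drill axis."""
    margin = uncertainty_mm + drill_radius_mm
    return all(point_to_segment_distance(np.asarray(p), entry, target) > margin
               for p in structures)

# Hypothetical example: a nerve sampled as points (mm), 0.4 mm uncertainty,
# 0.9 mm drill radius.
entry, target = np.array([0.0, 0.0, 0.0]), np.array([0.0, 0.0, 25.0])
nerve_points = [[1.8, 0.5, 12.0], [2.0, 0.4, 14.0]]
print(trajectory_is_safe(entry, target, nerve_points, 0.4, 0.9))  # True
```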

Relevance:

80.00%

Publisher:

Abstract:

This book will serve as a foundation for a variety of useful applications of graph theory to computer vision, pattern recognition, and related areas. It covers a representative set of novel graph-theoretic methods for complex computer vision and pattern recognition tasks. The first part of the book presents the application of graph theory to low-level processing of digital images, such as a new method for partitioning a given image into a hierarchy of homogeneous areas using graph pyramids, and a study of the relationship between graph theory and digital topology. Part II presents graph-theoretic learning algorithms for high-level computer vision and pattern recognition applications, including a survey of graph-based methodologies for pattern recognition and computer vision, a series of computationally efficient algorithms for testing graph isomorphism and related graph matching tasks in pattern recognition, and a new graph distance measure for solving graph matching problems. Finally, Part III provides detailed descriptions of several applications of graph-based methods to real-world pattern recognition tasks. It includes a critical review of the main graph-based and structural methods for fingerprint classification, a new method to visualize time series of graphs, and potential applications in computer network monitoring and abnormal event detection.

Relevance:

80.00%

Publisher:

Abstract:

Clinical assessments after Total Knee Arthroplasty (TKA) show persisting pain after implantation in over 20% of patients. Impingement of soft tissue around the knee, due to imprecise geometry of the tibial implant, can be one reason for persistent complaints. Two hundred and thirty-seven MRI scans were evaluated using an active contour detection algorithm (snake) to obtain a high-resolution mean anatomical shape of the tibial plateau. Differences between female and male, older and younger (split at 40 years), and left and right averaged shapes were determined. The shapes obtained were asymmetric throughout. Absolute differences between the subgroups fell short of the inter-individual variation represented by calculated one-sigma confidence intervals. Our results indicate that a differentiation in TKA tibial plateau design by gender, age, or side is of minor relevance.
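A minimal sketch of the contour-extraction step, using the generic active-contour (snake) implementation from scikit-image as a stand-in for the study's algorithm: a circular initial contour is evolved toward the plateau boundary on a smoothed slice. The initialization and parameter values are assumptions, not the study's settings.

```python
import numpy as np
from skimage.filters import gaussian
from skimage.segmentation import active_contour

def fit_plateau_contour(slice_img, center_rc, radius_px, n_points=200):
    """Evolve a circular initial snake toward the tibial plateau boundary
    on one MRI slice (illustrative parameters)."""
    s = np.linspace(0, 2 * np.pi, n_points)
    init = np.column_stack([center_rc[0] + radius_px * np.sin(s),
                            center_rc[1] + radius_px * np.cos(s)])
    smoothed = gaussian(slice_img, sigma=2, preserve_range=True)
    return active_contour(smoothed, init, alpha=0.015, beta=10, gamma=0.001)
```

Averaging such contours across registered scans would then yield a mean anatomical shape of the kind analyzed in the study.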

Relevance:

80.00%

Publisher:

Abstract:

The acquisition of conventional X-ray radiographs remains the standard imaging procedure for the diagnosis of hip-related problems. However, recent studies have demonstrated the benefit of using three-dimensional (3D) surface models in the clinical routine. 3D surface models of the hip joint are useful for assessing the dynamic range of motion in order to identify possible pathologies such as femoroacetabular impingement. In this paper, we present an integrated system consisting of X-ray radiograph calibration and subsequent 2D/3D hip joint reconstruction for the diagnosis and planning of hip-related problems. A mobile phantom with two different sizes of fiducials, which can be robustly detected within the images, was developed for X-ray radiograph calibration. On the basis of the calibrated X-ray images, a 3D reconstruction method for the acetabulum was developed and applied together with existing techniques to reconstruct a 3D surface model of the hip joint. X-ray radiographs of dry cadaveric hip bones and of one cadaveric specimen with soft tissue were used to demonstrate the robustness of the developed fiducial detection algorithm. Computed tomography scans of the cadaveric bones were used to validate the accuracy of the integrated system. The fiducial detection sensitivity was in the same range for both fiducial sizes: 97.96% for the large fiducials and 97.62% for the small fiducials. The acetabulum and the proximal femur were reconstructed with mean surface distance errors of 1.06 and 1.01 mm, respectively. The results for fiducial detection sensitivity and 3D surface reconstruction demonstrate the capability of the integrated system for 3D hip joint reconstruction from calibrated 2D X-ray radiographs.
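As a sketch of the fiducial-detection and sensitivity-evaluation steps, the spherical fiducials can be treated as bright blobs and found with a Laplacian-of-Gaussian blob detector; this is a generic stand-in for the paper's model-based detector, and the radii, threshold, and matching tolerance are assumptions.

```python
import numpy as np
from skimage.feature import blob_log

def detect_fiducials(radiograph, r_min_px=4, r_max_px=12, threshold=0.1):
    """Return (row, col, radius) for each detected fiducial candidate."""
    img = (radiograph - radiograph.min()) / np.ptp(radiograph)  # normalize
    blobs = blob_log(img, min_sigma=r_min_px / np.sqrt(2),
                     max_sigma=r_max_px / np.sqrt(2), threshold=threshold)
    blobs[:, 2] *= np.sqrt(2)          # convert sigma to approximate radius
    return blobs

def sensitivity(detected, ground_truth, tol_px=3):
    """Fraction of ground-truth fiducials matched by a detection within tol_px."""
    hits = sum(any(np.hypot(d[0] - g[0], d[1] - g[1]) <= tol_px
                   for d in detected) for g in ground_truth)
    return hits / len(ground_truth)
```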

Relevance:

80.00%

Publisher:

Abstract:

Various applications for the purposes of event detection, localization, and monitoring can benefit from the use of wireless sensor networks (WSNs). Wireless sensor networks are generally easy to deploy, have a flexible topology, and can support a diversity of tasks thanks to the large variety of sensors that can be attached to the wireless sensor nodes. To guarantee the efficient operation of such a heterogeneous wireless sensor network during its lifetime, appropriate management is necessary. Typically, there are three management tasks, namely monitoring, (re)configuration, and code updating. On the one hand, status information of both the wireless sensor network and the sensor nodes, such as battery state and node connectivity, has to be monitored; on the other hand, sensor nodes have to be (re)configured, e.g., by setting the sensing interval. Most importantly, new applications have to be deployed and bug fixes have to be applied during the network lifetime. All management tasks have to be performed in a reliable, time- and energy-efficient manner. The ability to disseminate data from one sender to multiple receivers in a reliable, time- and energy-efficient manner is critical for the execution of the management tasks, especially for code updating. Multicast communication is an efficient way to handle such a traffic pattern in wireless sensor networks. Due to the nature of code updates, a multicast protocol has to support bulky traffic and end-to-end reliability. Further, the limited resources of wireless sensor nodes demand an energy-efficient operation of the multicast protocol. Current data dissemination schemes do not fulfil all of the above requirements. In order to close this gap, we designed the Sensor Node Overlay Multicast (SNOMC) protocol to support reliable, time-efficient and energy-efficient dissemination of data from one sender node to multiple receivers. In contrast to other multicast transport protocols, which do not support reliability mechanisms, SNOMC provides end-to-end reliability using a NACK-based mechanism. The mechanism is simple and easy to implement and can significantly reduce the number of transmissions. It is complemented by a data acknowledgement after successful reception of all data fragments by the receiver nodes. SNOMC integrates three different caching strategies for efficient handling of necessary retransmissions: caching on each intermediate node, caching on branching nodes, or caching only on the sender node. Moreover, an option was included to pro-actively request missing fragments. SNOMC was evaluated both in the OMNeT++ simulator and in our in-house real-world testbed, and compared to a number of common data dissemination protocols, such as Flooding, MPR, TinyCubus, PSFQ, and both UDP and TCP. The results showed that SNOMC outperforms the selected protocols in terms of transmission time, number of transmitted packets, and energy consumption. Moreover, we showed that SNOMC performs well with different underlying MAC protocols, which support different levels of reliability and energy-efficiency. Thus, SNOMC can offer a robust, high-performing solution for the efficient distribution of code updates and management information in a wireless sensor network. To address the three management tasks, in this thesis we developed the Management Architecture for Wireless Sensor Networks (MARWIS). MARWIS is specifically designed for the management of heterogeneous wireless sensor networks.
A distinguishing feature of its design is the use of wireless mesh nodes as a backbone, which enables diverse communication platforms and the offloading of functionality from the sensor nodes to the mesh nodes. This hierarchical architecture allows for efficient operation of the management tasks, due to the organisation of the sensor nodes into small sub-networks, each managed by a mesh node. Furthermore, we developed an intuitive graphical user interface, which allows non-expert users to easily perform management tasks in the network. In contrast to other management frameworks, such as Mate, MANNA, and TinyCubus, or code dissemination protocols, such as Impala, Trickle, and Deluge, MARWIS offers an integrated solution for monitoring, configuration, and code updating of sensor nodes. Integration of SNOMC into MARWIS further increases the efficiency of the management tasks. To our knowledge, our approach is the first to combine a management architecture with an efficient overlay multicast transport protocol. This combination of SNOMC and MARWIS supports the reliable, time- and energy-efficient operation of a heterogeneous wireless sensor network.
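The NACK-based end-to-end reliability mechanism described above can be sketched from the receiver's perspective: it records which fragments of a code image have arrived, requests only the missing ones, and sends a data acknowledgement once the image is complete. Class and message names are illustrative assumptions, not SNOMC's wire format.

```python
from dataclasses import dataclass, field

@dataclass
class ReceiverState:
    """Tracks received fragments of one disseminated code image."""
    n_fragments: int
    received: set = field(default_factory=set)

    def on_fragment(self, frag_id: int) -> None:
        self.received.add(frag_id)

    def missing(self) -> list:
        """Fragment ids to request in a NACK; empty means send a data ACK."""
        return [i for i in range(self.n_fragments) if i not in self.received]

# Usage: fragments 0..9 are sent, 3 and 7 are lost on the way.
rx = ReceiverState(n_fragments=10)
for frag in [0, 1, 2, 4, 5, 6, 8, 9]:
    rx.on_fragment(frag)
print("NACK:", rx.missing())   # -> NACK: [3, 7]; only these are retransmitted
```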

Relevance:

40.00%

Publisher:

Abstract:

A novel computerized algorithm for hip joint motion simulation and collision detection, called the Equidistant Method, has been developed. It was compared with three pre-existing methods that differ in their definition of the hip joint center and in their behavior after collision detection. It was hypothesized that the Equidistant Method would be the most accurate in detecting the location and extent of femoroacetabular impingement.
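The abstract does not detail the Equidistant Method itself, but the common core of such motion-simulation collision checks can be sketched: rotate the femoral point cloud about the chosen hip joint center and flag impingement when any femoral point comes closer to the pelvic surface than a contact threshold. The rotation parametrization, the threshold, and all names are assumptions.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.spatial.transform import Rotation

def impinges(femur_pts, pelvis_pts, joint_center, flexion_deg, contact_mm=0.5):
    """True if the femur, flexed about the joint center, penetrates the
    pelvic surface (both surfaces given as (n, 3) point clouds in mm)."""
    rot = Rotation.from_euler("x", flexion_deg, degrees=True)
    moved = rot.apply(femur_pts - joint_center) + joint_center
    dist, _ = cKDTree(pelvis_pts).query(moved)  # nearest pelvic point per vertex
    return bool(dist.min() < contact_mm)
```

The methods compared in the study differ precisely in how `joint_center` is defined and in what happens once a collision is detected, which is why the choice matters for localizing impingement.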

Relevance:

30.00%

Publisher:

Abstract:

Cardiovascular event rates have been shown to increase substantially with the number of symptomatic disease locations. We sought to assess the risk profile, management, and subsequent event rates of patients with polyvascular disease. Consecutive outpatients were assessed for atherosclerotic risk factors and medications in the REACH Registry. A total of 19,117 symptomatic patients in Europe completed a 2-year follow-up: 77.2% with single arterial bed disease (coronary artery, cerebrovascular, or peripheral arterial disease) and 22.8% with polyvascular disease (≥ 1 additional disease location). Polyvascular disease patients were older (68.5 ± 9.4 vs 66.3 ± 9.9 years, p < 0.0001), more often current or former smokers (64.9% vs 58.7%, p < 0.0001), and more often suffered from hypertension (59.5% vs 46.6%, p < 0.0001) and diabetes (34.5% vs 25.9%, p < 0.0001) than single arterial bed disease patients. Despite more intense medical therapy, risk factors (smoking, hypertension, low fasting glucose, and low fasting total cholesterol) were less often controlled in polyvascular disease patients. This was associated with substantially more events over 2 years compared with single arterial bed disease patients (cMACCE [cardiovascular death/non-fatal stroke/non-fatal MI] odds ratio, 1.63 [95% CI, 1.45-1.83], p < 0.0001). In conclusion, polyvascular disease patients have more cardiovascular risk factors, and their prognosis is significantly worse than that of patients with single arterial bed disease. This suggests a need to improve detection and consequent medical treatment of polyvascular disease.

Relevance:

30.00%

Publisher:

Abstract:

Cardiogoniometry (CGM), a spatiotemporal electrocardiologic 5-lead method with automated analysis, may be useful in primary healthcare for detecting coronary artery disease (CAD) at rest. Our aim was to systematically develop a stenosis-specific parameter set for global CAD detection. In 793 consecutively admitted patients with presumed non-acute CAD, CGM data were collected prior to elective coronary angiography and analyzed retrospectively. 658 patients fulfilled the inclusion criteria; 405 had CAD verified by coronary angiography, and the 253 patients with normal coronary angiograms served as the non-CAD controls. Study patients, matched for age, BMI, and gender, were angiographically assigned to 8 stenosis-specific CAD categories or to the controls. For each CAD category, the one CGM parameter reaching significance (P < .05) with the best diagnostic accuracy was selected. The area under the ROC curve was .80 (global CAD versus controls). The resulting set of 8 stenosis-specific CGM parameters describes the variability of R vectors and R-T angles, the spatial position and potential distribution of R/T vectors, and ST/T segment alterations. Our parameter set systematically combines the CAD categories into an algorithm that detects CAD globally. Prospective validation in clinical studies is ongoing.
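The parameter-selection step can be illustrated as follows: for each stenosis-specific category, keep the CGM parameter that separates cases from controls significantly and with the highest ROC AUC. The data layout and the choice of a Mann-Whitney U test are assumptions standing in for the study's statistics.

```python
import numpy as np
from scipy.stats import mannwhitneyu
from sklearn.metrics import roc_auc_score

def best_parameter(cases, controls, alpha=0.05):
    """cases/controls: (n_patients, n_parameters) arrays of CGM parameters
    for one CAD category and for the controls. Returns the index of the
    significant parameter with the highest AUC, or None if none qualifies."""
    best_idx, best_auc = None, 0.5
    y = np.r_[np.ones(len(cases)), np.zeros(len(controls))]
    for j in range(cases.shape[1]):
        scores = np.r_[cases[:, j], controls[:, j]]
        p = mannwhitneyu(cases[:, j], controls[:, j]).pvalue
        # Orientation-free AUC: a parameter may discriminate in either direction.
        auc = max(roc_auc_score(y, scores), roc_auc_score(y, -scores))
        if p < alpha and auc > best_auc:
            best_idx, best_auc = j, auc
    return best_idx
```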

Relevance:

30.00%

Publisher:

Abstract:

An automated algorithm for detection of the acetabular rim was developed. The accuracy of the algorithm was validated in a sawbone study against manual digitizations, which were established as the ground truth. The manual digitizations proved reliable and reproducible, as demonstrated by almost perfect intra- and interobserver reliability. Validation of the automated algorithm showed no significant difference from the manually acquired data in terms of detected version and inclination. Automated detection of the acetabular rim contour and of the spatial orientation of the acetabular opening plane can thus be accurately achieved with this algorithm.
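As a sketch of how version and inclination can be derived once the rim is detected, the following fits the acetabular opening plane to the rim points by a least-squares (SVD) plane fit and reads the two angles off the plane normal. The coordinate convention and angle definitions are illustrative assumptions, not necessarily those used in the study.

```python
import numpy as np

def opening_plane_angles(rim_pts):
    """rim_pts: (n, 3) acetabular rim points in a pelvis-aligned frame
    (x lateral, y anterior, z cranial). Returns illustrative
    (inclination_deg, anteversion_deg) of the opening-plane normal."""
    centered = rim_pts - rim_pts.mean(axis=0)
    normal = np.linalg.svd(centered)[2][-1]   # smallest-variance direction
    inclination = np.degrees(np.arccos(abs(normal[2])))  # tilt vs. cranial axis
    anteversion = np.degrees(np.arcsin(abs(normal[1])))  # tilt out of coronal plane
    return inclination, anteversion
```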