899 results for Synchronization algorithms
Abstract:
In this work, a method that synchronizes two video sequences is proposed. Unlike previous methods, which require the existence of correspondences between features tracked in the two sequences, and/or that the cameras are static or jointly moving, the proposed approach does not impose any of these constraints. It works when the cameras move independently, even if different features are tracked in the two sequences. The assumptions underlying the proposed strategy are that the intrinsic parameters of the cameras are known and that two rigid objects, with independent motions in the scene, are visible in both sequences. The relative motion between these objects is used as a clue for the synchronization. The extrinsic parameters of the cameras are assumed to be unknown. A new synchronization algorithm for static or jointly moving cameras that see (possibly) different parts of a common rigidly moving object is also proposed. Proof-of-concept experiments that illustrate the performance of these methods are presented, as well as a comparison with a state-of-the-art approach.
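For illustration only (the paper's actual geometric method based on the relative motion of two rigid objects and known intrinsics is not reproduced here), the following sketch shows the generic idea of scoring candidate temporal offsets between two sequences. It assumes a hypothetical per-frame scalar, measured independently in each video, that describes the relative motion between the two objects (e.g. a relative rotation angle per frame):

```python
import numpy as np

def best_offset(motion_a, motion_b, max_offset=50):
    """Brute-force search for the temporal offset aligning two sequences.

    motion_a, motion_b : 1-D arrays of a per-frame scalar describing the
        relative motion between the two objects, one array per video.
    Returns the integer offset of sequence B relative to A that minimizes
    the mean squared difference over the overlapping frames.
    """
    best, best_err = 0, np.inf
    for off in range(-max_offset, max_offset + 1):
        if off >= 0:
            a, b = motion_a[off:], motion_b
        else:
            a, b = motion_a, motion_b[-off:]
        n = min(len(a), len(b))
        if n == 0:
            continue
        err = np.mean((a[:n] - b[:n]) ** 2)
        if err < best_err:
            best, best_err = off, err
    return best
```

This is only a toy alignment over integer frame offsets; the proposed method does not assume such a pre-computed scalar signal.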
Abstract:
Long-term electrocardiogram (ECG) recordings often suffer from relevant noise. Baseline wander in particular is pronounced in ECG recordings using dry or esophageal electrodes, which are dedicated to prolonged registration. While analog high-pass filters introduce phase distortions, reliable offline filtering of the baseline wander implies a computational burden that has to be put in relation to the increase in signal-to-baseline ratio (SBR). Here we present a graphics processing unit (GPU) based parallelization method to speed up offline baseline wander filter algorithms, namely the wavelet, finite impulse response, infinite impulse response, moving mean, and moving median filters. Individual filter parameters were optimized with respect to the SBR increase based on ECGs from the Physionet database superimposed on auto-regressively modeled, real baseline wander. A Monte-Carlo simulation showed that for low input SBR the moving median filter outperforms any other method but negatively affects ECG wave detection. In contrast, the infinite impulse response filter is preferred in case of high input SBR. However, the parallelized wavelet filter is processed 500 and 4 times faster than these two algorithms on the GPU, respectively, and offers superior baseline wander suppression in low SBR situations. Using a signal segment of 64 mega samples that is filtered as an entire unit, wavelet filtering of a 7-day high-resolution ECG is computed in less than 3 seconds. Taking the high filtering speed into account, the GPU wavelet filter is the most efficient method to remove baseline wander present in long-term ECGs, strongly reducing the computational burden.
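As a minimal sketch of the moving-median approach evaluated above (a plain CPU version, not the GPU-parallelized implementation from the paper), the baseline can be estimated with a sliding median and subtracted; the window length and sampling rate below are hypothetical choices, not parameters reported in the study:

```python
import numpy as np
from scipy.signal import medfilt

def remove_baseline_moving_median(ecg, fs, window_s=0.6):
    """Subtract a moving-median estimate of the baseline wander.

    ecg      : 1-D ECG signal as a numpy array
    fs       : sampling rate in Hz
    window_s : median window length in seconds (hypothetical choice)
    """
    kernel = int(window_s * fs) | 1          # medfilt requires an odd kernel
    baseline = medfilt(ecg, kernel_size=kernel)
    return ecg - baseline, baseline

# Toy usage: a 5 Hz oscillation plus a slow 0.1 Hz drift standing in for wander
fs = 250
t = np.arange(0, 10, 1 / fs)
signal = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 0.1 * t)
clean, drift_estimate = remove_baseline_moving_median(signal, fs)
```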
Abstract:
In the fermion loop formulation the contributions to the partition function naturally separate into topological equivalence classes with a definite sign. This separation forms the basis for an efficient fermion simulation algorithm using a fluctuating open fermion string. It guarantees sufficient tunnelling between the topological sectors, and hence provides a solution to the fermion sign problem affecting systems with broken supersymmetry. Moreover, the algorithm shows no critical slowing down even in the massless limit and can hence handle the massless Goldstino mode emerging in the supersymmetry broken phase. In this paper – the third in a series of three – we present the details of the simulation algorithm and demonstrate its efficiency by means of a few examples.
Abstract:
We present new algorithms for M-estimators of multivariate scatter and location and for symmetrized M-estimators of multivariate scatter. The new algorithms are considerably faster than currently used fixed-point and related algorithms. The main idea is to utilize a second order Taylor expansion of the target functional and to devise a partial Newton-Raphson procedure. In connection with symmetrized M-estimators we work with incomplete U-statistics to accelerate our procedures initially.
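For context, here is a minimal sketch of a classical fixed-point iteration for Tyler's M-estimator of scatter, i.e. the kind of baseline algorithm the new partial Newton-Raphson procedure is designed to outperform; this is not the proposed algorithm, and the iteration limits and tolerance are arbitrary:

```python
import numpy as np

def tyler_scatter(X, max_iter=200, tol=1e-8):
    """Classical fixed-point iteration for Tyler's M-estimator of scatter.

    X : (n, p) data matrix, rows assumed centered at a location estimate.
    Returns a scatter matrix normalized to trace p.
    """
    n, p = X.shape
    Sigma = np.eye(p)
    for _ in range(max_iter):
        inv = np.linalg.inv(Sigma)
        # squared Mahalanobis distances under the current scatter estimate
        d2 = np.einsum('ij,jk,ik->i', X, inv, X)
        Sigma_new = (p / n) * (X.T @ (X / d2[:, None]))
        Sigma_new *= p / np.trace(Sigma_new)     # fix the arbitrary scale
        if np.linalg.norm(Sigma_new - Sigma, 'fro') < tol:
            return Sigma_new
        Sigma = Sigma_new
    return Sigma

# Toy usage with heavy-tailed data, centered at the coordinatewise median
rng = np.random.default_rng(0)
X = rng.standard_t(df=3, size=(500, 4))
print(tyler_scatter(X - np.median(X, axis=0)))
```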
Abstract:
This paper presents a shallow dialogue analysis model, aimed at human-human dialogues in the context of staff or business meetings. Four components of the model are defined, and several machine learning techniques are used to extract features from dialogue transcripts: maximum entropy classifiers for dialogue acts, latent semantic analysis for topic segmentation, and decision tree classifiers for discourse markers. A rule-based approach is proposed for resolving cross-modal references to meeting documents. The methods are trained and evaluated on a common data set and annotation format. The integration of the components into an automated shallow dialogue parser opens the way to multimodal meeting processing and retrieval applications.
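As a small, hedged illustration of the dialogue-act component (not the paper's feature set, corpus, or tag set), multinomial logistic regression over bag-of-words features is a standard maximum-entropy classifier; the utterances and labels below are invented:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy utterances with invented dialogue-act labels
utterances = ["can we move on to the budget",
              "i agree with that",
              "what time is the next meeting",
              "yes that sounds fine"]
acts = ["statement", "agreement", "question", "agreement"]

# Multinomial logistic regression = maximum-entropy classification
clf = make_pipeline(CountVectorizer(), LogisticRegression(max_iter=1000))
clf.fit(utterances, acts)
print(clf.predict(["do you agree with the plan"]))
```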
Abstract:
BACKGROUND Lung clearance index (LCI), a marker of ventilation inhomogeneity, is elevated early in children with cystic fibrosis (CF). However, in infants with CF, LCI values are found to be normal, although structural lung abnormalities are often detectable. We hypothesized that this discrepancy is due to inadequate algorithms in the available software package. AIM Our aim was to challenge the validity of these software algorithms. METHODS We compared multiple breath washout (MBW) results of the current software algorithms (automatic modus) to refined algorithms (manual modus) in 17 asymptomatic infants with CF and 24 matched healthy term-born infants. The main difference between these two analysis methods lies in the calculation of the molar mass differences that the system uses to define the completion of the measurement. RESULTS In infants with CF the refined manual modus revealed clearly elevated LCI above 9 in 8 out of 35 measurements (23%), all showing LCI values below 8.3 with the automatic modus (paired t-test comparing the means, P < 0.001). Healthy infants showed normal LCI values with both analysis methods (n = 47, paired t-test, P = 0.79). The most relevant reason for falsely normal LCI values in infants with CF using the automatic modus was premature recognition of the end-of-test during the washout. CONCLUSION We recommend the use of the manual modus for the analysis of MBW outcomes in infants in order to obtain more accurate results. This will allow appropriate use of infant lung function results for clinical and scientific purposes.
Abstract:
BACKGROUND: HIV surveillance requires monitoring of new HIV diagnoses and differentiation of incident and older infections. In 2008, Switzerland implemented a system for monitoring incident HIV infections based on the results of a line immunoassay (Inno-Lia) mandatorily conducted for HIV confirmation and type differentiation (HIV-1, HIV-2) of all newly diagnosed patients. Based on this system, we assessed the proportion of incident HIV infection among newly diagnosed cases in Switzerland during 2008-2013. METHODS AND RESULTS: Inno-Lia antibody reaction patterns recorded in anonymous HIV notifications to the federal health authority were classified by 10 published algorithms into incident (up to 12 months) or older infections. Utilizing these data, annual incident infection estimates were obtained in two ways: (i) based on the diagnostic performance of the algorithms and the relationship 'incident = true incident + false incident', and (ii) based on the window periods of the algorithms and the relationship 'Prevalence = Incidence × Duration'. From 2008-2013, 3'851 HIV notifications were received. Adult HIV-1 infections amounted to 3'809 cases, and 3'636 of them (95.5%) contained Inno-Lia data. The calculated incident infection totals were similar for the performance-based and window-based methods, amounting on average to 1'755 (95% confidence interval, 1588-1923) and 1'790 cases (95% CI, 1679-1900), respectively. More than half of these were among men who had sex with men. Both methods showed a continuous decline of annual incident infections over 2008-2013, totaling -59.5% and -50.2%, respectively. The decline of incident infections continued even in 2012, when a 15% increase in HIV notifications had been observed; this increase was entirely due to older infections. Overall declines over 2008-2013 were of similar extent among the major transmission groups. CONCLUSIONS: Inno-Lia based incident HIV-1 infection surveillance proved useful and reliable. It represents a free, additional public health benefit of the use of this relatively costly test for HIV confirmation and type differentiation.
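A minimal illustration of the second, window-period based estimate (the numbers below are invented, not taken from the study): if a recency algorithm classifies a number of the year's notifications as recent infections, rearranging Prevalence = Incidence × Duration gives the annual incidence as that count divided by the algorithm's window period expressed in years.

```python
def annual_incident_cases(n_recent, window_days):
    """Back-calculate annual incident infections from a recency algorithm.

    n_recent    : notifications the algorithm classifies as recent
    window_days : the algorithm's mean window period in days, i.e. how long
                  an infection remains classifiable as recent
    Rearranges Prevalence = Incidence x Duration into
    Incidence = n_recent / (window_days / 365.25).
    """
    return n_recent * 365.25 / window_days

# Hypothetical numbers, not from the study:
print(annual_incident_cases(n_recent=120, window_days=180))  # ~243.5 per year
```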
Abstract:
Indoor positioning has attracted considerable attention for decades due to the increasing demand for location-based services. Although numerous methods have been proposed for indoor positioning in past years, it is still challenging to find a convincing solution that combines high positioning accuracy with ease of deployment. Radio-based indoor positioning has emerged as a dominant method due to its ubiquitousness, especially for WiFi. RSSI (Received Signal Strength Indicator) has been investigated in the area of indoor positioning for decades. However, it is prone to multipath propagation, and hence fingerprinting has become the most commonly used RSSI-based method for indoor positioning. The drawback of fingerprinting is that it requires intensive labour to calibrate the radio map prior to experiments, which makes deployment of the positioning system very time consuming. Using time information as another basis for radio positioning is challenged by time synchronization among anchor nodes and by timestamp accuracy. Besides radio-based positioning methods, intensive research has been conducted on the use of inertial sensors for indoor tracking, driven by the fast development of smartphones. However, these methods are normally prone to accumulative errors and might not be available for some applications, such as passive positioning. This thesis focuses on network-based indoor positioning and tracking systems, mainly for passive positioning, which does not require the participation of targets in the positioning process. To achieve high positioning accuracy, we exploit information about radio signals obtained from physical-layer processing, such as timestamps and channel information. The contributions of this thesis can be divided into two parts: time-based positioning and channel-information based positioning. First, for time-based indoor positioning (especially for narrow-band signals), we address the challenges of compensating synchronization offsets among anchor nodes, designing timestamps with high resolution, and developing accurate positioning methods. Second, we work on range-based positioning methods with channel information to passively locate and track WiFi targets; aiming at low deployment effort, these range-based methods require much less calibration than fingerprinting. By designing novel enhanced methods for both ranging and positioning (including trilateration for stationary targets and a particle filter for mobile targets), we are able to locate WiFi targets with high accuracy relying solely on radio signals, and our proposed enhanced particle filter significantly outperforms other commonly used range-based positioning algorithms, e.g., a traditional particle filter, an extended Kalman filter, and trilateration. In addition to using radio signals for passive positioning, we propose a second enhanced particle filter for active positioning that fuses inertial sensor and channel information to track indoor targets, achieving higher tracking accuracy than tracking methods relying solely on either radio signals or inertial sensors.
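As a small sketch of the range-based idea for stationary targets (not the enhanced methods developed in the thesis), trilateration can be posed as a linear least-squares problem by subtracting one anchor's range equation from the others; the anchor coordinates and range values below are hypothetical:

```python
import numpy as np

def trilaterate(anchors, ranges):
    """Least-squares trilateration of a 2-D target position.

    anchors : (m, 2) array of anchor coordinates
    ranges  : (m,) array of estimated distances to the target
    Linearizes by subtracting the first anchor's range equation:
    2 (x_i - x_0)^T p = ||x_i||^2 - ||x_0||^2 - r_i^2 + r_0^2
    """
    anchors = np.asarray(anchors, float)
    ranges = np.asarray(ranges, float)
    x0, r0 = anchors[0], ranges[0]
    A = 2 * (anchors[1:] - x0)
    b = (np.sum(anchors[1:] ** 2, axis=1) - np.sum(x0 ** 2)
         - ranges[1:] ** 2 + r0 ** 2)
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# Hypothetical anchors at room corners and slightly noisy ranges to a
# target located near (2, 3)
anchors = [(0, 0), (10, 0), (0, 8), (10, 8)]
ranges = [3.6, 8.5, 5.4, 9.4]
print(trilaterate(anchors, ranges))
```

The enhanced particle filter mentioned above would instead track a mobile target by weighting hypothesized positions against such range measurements over time; it is not reproduced here.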
Abstract:
Academic and industrial research in the late 90s brought about an exponential explosion of DNA sequence data. Automated expert systems are being created to help biologists extract patterns, trends and links from this ever-deepening ocean of information. Two such systems, aimed at retrieving and subsequently utilizing phylogenetically relevant information, have been developed in this dissertation, the major objective of which was to automate the often difficult and confusing phylogenetic reconstruction process. Popular phylogenetic reconstruction methods, such as distance-based methods, attempt to find an optimal tree topology (one that reflects the relationships among related sequences and their evolutionary history) by searching through the topology space. Various compromises between the fast (but incomplete) and exhaustive (but computationally prohibitive) search heuristics have been suggested. An intelligent compromise algorithm that relies on a flexible "beam" search principle from the Artificial Intelligence domain and uses pre-computed local topology reliability information to continuously adjust the beam search space is described in the second chapter of this dissertation. However, sometimes even a (virtually) complete distance-based method is inferior to the significantly more elaborate (and computationally expensive) maximum likelihood (ML) method. In fact, depending on the nature of the sequence data in question, either method might prove to be superior. Therefore, it is difficult (even for an expert) to tell a priori which phylogenetic reconstruction method should be chosen for any particular data set: distance-based, ML, or maybe maximum parsimony (MP). A number of factors, often hidden, influence the performance of a method. For example, it is generally understood that for a phylogenetically "difficult" data set more sophisticated methods (e.g., ML) tend to be more effective and thus should be chosen. However, it is the interplay of many factors that one needs to consider in order to avoid choosing an inferior method (potentially a costly mistake, both in terms of computational expense and in terms of reconstruction accuracy). Chapter III of this dissertation details a phylogenetic reconstruction expert system that selects the proper method automatically. It uses a classifier (a Decision Tree-inducing algorithm) to map a new data set to the appropriate phylogenetic reconstruction method.
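A generic sketch of the "beam" search principle the second chapter builds on (not the adaptive, reliability-weighted variant described above): at each step only the best-scoring partial solutions are kept. The expansion and scoring functions are placeholders to be supplied by the caller.

```python
def beam_search(initial_states, expand, score, beam_width=5, max_steps=20):
    """Generic beam search over partial solutions.

    initial_states : iterable of starting partial solutions
                     (e.g. partial tree topologies)
    expand(state)  : returns candidate successor states
    score(state)   : higher is better (e.g. a topology reliability score)
    beam_width     : number of best candidates kept at each step
    """
    beam = sorted(initial_states, key=score, reverse=True)[:beam_width]
    for _ in range(max_steps):
        candidates = [succ for state in beam for succ in expand(state)]
        if not candidates:
            break
        beam = sorted(candidates, key=score, reverse=True)[:beam_width]
    return beam[0]
```

A beam width of 1 degenerates to greedy search, while an unbounded width approaches exhaustive search, which is the compromise the dissertation exploits.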
Abstract:
Background. Diabetes places a significant burden on the health care system. Reduction in blood glucose levels (HbA1c) reduces the risk of complications; however, little is known about the impact of disease management programs on medical costs for patients with diabetes. In 2001, economic costs associated with diabetes totaled $100 billion, and indirect costs totaled $54 billion. Objective. To compare outcomes of nurse case management using treatment algorithms with conventional primary care for glycemic control and cardiovascular risk factors in type 2 diabetic patients in a low-income Mexican American community-based setting, and to compare the cost effectiveness of the two programs. Patient compliance was also assessed. Research design and methods. An observational group comparison to evaluate a treatment intervention for type 2 diabetes management was implemented at three out-patient health facilities in San Antonio, Texas. All eligible type 2 diabetic patients attending the clinics during 1994–1996 became part of the study. Data were obtained from the study database, medical records, hospital accounting, and pharmacy cost lists, and entered into a computerized database. Three groups were compared: a Community Clinic Nurse Case Manager (CC-TA) following treatment algorithms, a University Clinic Nurse Case Manager (UC-TA) following treatment algorithms, and Primary Care Physicians (PCP) following conventional care practices at a Family Practice Clinic. The algorithms provided a disease management model specifically for hyperglycemia, dyslipidemia, hypertension, and microalbuminuria that progressively moved the patient toward ideal goals through adjustments in medication, self-monitoring of blood glucose, meal planning, and reinforcement of diet and exercise. Cost effectiveness of hemoglobin A1c final endpoints was compared. Results. There were 358 patients analyzed: 106 patients in the CC-TA group, 170 in the UC-TA group, and 82 in the PCP group. Change in hemoglobin A1c (HbA1c) was the primary outcome measured. HbA1c results at baseline, 6 and 12 months were 10.4%, 7.1%, and 7.3% for CC-TA; 10.5%, 7.1%, and 7.2% for UC-TA; and 10.0%, 8.5%, and 8.7% for PCP. Mean patient compliance was 81%. Levels of cost effectiveness were significantly different between clinics. Conclusion. Nurse case management with treatment algorithms significantly improved glycemic control in patients with type 2 diabetes and was more cost effective.
Abstract:
Digital terrain models (DTM) typically contain large numbers of postings, from hundreds of thousands to billions. Many algorithms that run on DTMs require topological knowledge of the postings, such as finding nearest neighbors, finding the posting closest to a chosen location, etc. If the postings are arranged irregularly, topological information is costly to compute and to store. This paper offers a practical approach to organizing and searching irregularly-spaced data sets by presenting a collection of efficient algorithms (O(N), O(lg N)) that compute important topological relationships with only a simple supporting data structure. These relationships include finding the postings within a window, locating the posting nearest a point of interest, finding the neighborhood of postings nearest a point of interest, and ordering the neighborhood counter-clockwise. These algorithms depend only on two sorted arrays of two-element tuples, each holding a planimetric coordinate and an integer identification number indicating which posting the coordinate belongs to. There is one array for each planimetric coordinate (eastings and northings). These two arrays cost minimal overhead to create and store but permit the data to remain arranged irregularly.
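A minimal sketch of the supporting data structure and a window query, assuming the postings are given as (easting, northing) pairs; the variable names and example coordinates are illustrative only, and the paper's remaining queries (nearest posting, neighborhood, counter-clockwise ordering) are not reproduced:

```python
from bisect import bisect_left, bisect_right

def build_index(postings):
    """Build the two sorted arrays of (coordinate, posting id) tuples."""
    east = sorted((e, i) for i, (e, n) in enumerate(postings))
    north = sorted((n, i) for i, (e, n) in enumerate(postings))
    return east, north

def window_query(index, e_min, e_max, n_min, n_max):
    """Ids of postings inside an axis-aligned window.

    Binary search locates each coordinate range in O(lg N); the candidate
    sets from the two arrays are then intersected.
    """
    east, north = index
    lo = bisect_left(east, (e_min, -1))
    hi = bisect_right(east, (e_max, float('inf')))
    in_east = {i for _, i in east[lo:hi]}
    lo = bisect_left(north, (n_min, -1))
    hi = bisect_right(north, (n_max, float('inf')))
    return [i for _, i in north[lo:hi] if i in in_east]

# Hypothetical irregular postings and a query window
postings = [(3.2, 7.1), (5.0, 2.4), (4.1, 6.3), (9.7, 8.8)]
idx = build_index(postings)
print(window_query(idx, 3.0, 6.0, 5.0, 8.0))  # -> [2, 0]
```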
Abstract:
We here present a synchronization of the NGRIP, GRIP, and GISP2 ice cores based mainly on volcanic events over the period 14.9-32.45 ka b2k (before AD 2000), corresponding to Marine Isotope Stage 2 (MIS 2) and the end of MIS 3. The matching provides a basis for applying the recent NGRIP-based Greenland Ice Core Chronology 2005 (GICC05) time scale to the GRIP and GISP2 ice cores, thereby making it possible to compare the synchronized palaeoclimate profiles of the cores in detail and to identify relative accumulation differences between the cores. Based on the matching, a period of anomalously high accumulation rates in the GISP2 ice core is detected within the period 16.5-18.3 ka b2k. The δ18O and [Ca2+] profiles of the three cores are presented on the common GICC05 time scale and generally show excellent agreement across the stadial-interstadial transitions and across the two characteristic dust events in Greenland Stadial 3. However, large differences between the δ18O and [Ca2+] profiles of the three cores are seen in the same period as the 7-9% increase in the GISP2 accumulation rate. We conclude that changes in the atmospheric circulation are likely to have occurred in this period, altering the spatial gradients in Greenland and resulting in larger variations between the records.