948 results for gravitational capture


Relevance:

20.00%

Publisher:

Abstract:

In this paper we present a hybrid method to track human motion in real time. With simplified marker sets and monocular video input, the strengths of both marker-based and marker-free motion capture are exploited: cumbersome marker calibration is avoided, while the robustness of the marker-free tracking is enhanced by referencing the tracked marker positions. An improved inverse kinematics solver is employed for real-time pose estimation. A computer-vision-based approach is applied to refine the pose estimation and reduce the ambiguity of the inverse kinematics solutions. We use this hybrid method to capture typical table tennis upper-body movements in a real-time virtual reality application.
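For illustration only, the sketch below shows the kind of computation an inverse kinematics solver performs: a plain cyclic-coordinate-descent (CCD) solver for a planar two-joint limb driven towards a single target point. It is not the paper's improved solver, and the link lengths and target position are invented values.

```python
import math

# Minimal sketch (not the paper's solver): cyclic coordinate descent (CCD)
# inverse kinematics for a planar two-joint limb. A real-time tracker would run
# a similar loop per frame, constrained by tracked marker positions.

LINK_LENGTHS = [0.30, 0.25]   # upper-arm and forearm lengths in metres (illustrative)

def forward_kinematics(angles):
    """Return joint and end-effector positions for the given joint angles."""
    x = y = theta = 0.0
    points = [(x, y)]
    for length, angle in zip(LINK_LENGTHS, angles):
        theta += angle
        x += length * math.cos(theta)
        y += length * math.sin(theta)
        points.append((x, y))
    return points

def ccd_ik(target, angles, iterations=50, tol=1e-4):
    """Rotate each joint in turn so the end effector approaches the target."""
    for _ in range(iterations):
        for j in reversed(range(len(angles))):
            points = forward_kinematics(angles)
            end, pivot = points[-1], points[j]
            # Angle between the pivot->end and pivot->target vectors.
            a1 = math.atan2(end[1] - pivot[1], end[0] - pivot[0])
            a2 = math.atan2(target[1] - pivot[1], target[0] - pivot[0])
            angles[j] += a2 - a1
        end = forward_kinematics(angles)[-1]
        if math.hypot(end[0] - target[0], end[1] - target[1]) < tol:
            break
    return angles

# Example: drive the "wrist" towards a marker position at (0.35, 0.2) m.
print(ccd_ik((0.35, 0.2), [0.1, 0.1]))
```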

Relevance:

20.00%

Publisher:

Abstract:

OBJECTIVES This study was conducted to determine if an additional procedural endpoint of unexcitability (UE) to pacing along the ablation line reduces recurrence of atrial fibrillation (AF) or atrial tachycardia (AT) after radiofrequency catheter ablation. BACKGROUND AF/AT recurrence is common after pulmonary vein isolation (PVI). METHODS We included 102 patients from 2 centers (age 63 ± 10 years; 33 women; left atrium 38 ± 7 mm; left ventricular ejection fraction 61 ± 6%) with symptomatic paroxysmal AF. A 3-dimensional mapping system and circumferential mapping catheter were used in all patients for PVI. In group 1 (n = 50), the procedural endpoint was bidirectional block across the ablation line. In group 2 (n = 52), additional UE to bipolar pacing at an output of 10 mA and 2-ms pulse width was required. The primary endpoint was freedom from any AF/AT (>30 s) after discontinuation of antiarrhythmic drugs. RESULTS Procedural endpoints were successfully achieved in all patients. Procedure duration was significantly longer in group 2 (185 ± 58 min vs. 139 ± 57 min; p < 0.001); however, fluoroscopy times were not different (23 ± 9 min vs. 23 ± 9 min; p = 0.49). After a follow-up of 12 months in all patients, 26 patients (52%) in group 1 versus 43 (82.7%) in group 2 were free from any AF/AT (p = 0.001) after a single procedure. No major complications occurred. CONCLUSIONS The use of pacing to ensure UE along the PVI line markedly improved near-term single-procedure success, compared with demonstration of bidirectional block alone. This additional endpoint significantly improved patient outcomes after PVI. (Unexcitability Along the Ablation as an Endpoint for Atrial Fibrillation Ablation; NCT01724437).

Relevance:

20.00%

Publisher:

Abstract:

The development of susceptibility maps for debris flows is of primary importance due to population pressure in hazardous zones. However, hazard assessment by process-based modelling at a regional scale is difficult due to the complex nature of the phenomenon, the variability of local controlling factors, and the uncertainty in modelling parameters. A regional assessment must consider a simplified approach that is not highly parameter dependent and that can provide zonation with minimum data requirements. A distributed empirical model has thus been developed for regional susceptibility assessments using essentially a digital elevation model (DEM). The model is called Flow-R for Flow path assessment of gravitational hazards at a Regional scale (available free of charge at http://www.flow-r.org) and has been successfully applied to different case studies in various countries with variable data quality. It provides a substantial basis for a preliminary susceptibility assessment at a regional scale. The model was also found relevant for assessing other natural hazards such as rockfall, snow avalanches and floods. The model allows for automatic source area delineation, given user criteria, and for the assessment of the propagation extent based on various spreading algorithms and simple frictional laws. We developed a new spreading algorithm, an improved version of Holmgren's direction algorithm, that is less sensitive to small variations in the DEM and avoids over-channelization, thus producing more realistic extents. The choice of datasets and algorithms is left to the user, which makes the model adaptable to various applications and levels of dataset availability. Among the possible datasets, the DEM is the only one that is strictly required for both the source area delineation and the propagation assessment; its quality is of major importance for the accuracy of the results. We consider a 10 m DEM resolution a good compromise between processing time and quality of results. However, valuable results have still been obtained from lower-quality DEMs with 25 m resolution.
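As a minimal sketch of the building block behind such spreading algorithms, the code below implements Holmgren's (1994) multiple-flow-direction weighting on a single 3×3 DEM window, where flow to each lower neighbour is proportional to tan(slope) raised to an exponent x. The Flow-R modifications (persistence weighting, reduced DEM sensitivity) are not reproduced, and the exponent and elevation values are illustrative.

```python
import numpy as np

# Minimal sketch of Holmgren's (1994) multiple-flow-direction weighting on a
# single 3x3 DEM window. Flow-R's spreading algorithm is a modified version of
# this idea; the modifications themselves are not reproduced here.

CELL_SIZE = 10.0   # DEM resolution in metres (the paper's recommended compromise)
EXPONENT = 4.0     # Holmgren exponent x (illustrative; x=1 ~ plain MFD, x->inf ~ D8)

def holmgren_weights(window, cell_size=CELL_SIZE, x=EXPONENT):
    """Distribute flow from the centre cell to lower neighbours ~ tan(slope)**x."""
    centre = window[1, 1]
    weights = np.zeros((3, 3))
    for i in range(3):
        for j in range(3):
            if i == 1 and j == 1:
                continue
            distance = cell_size * np.hypot(i - 1, j - 1)
            tan_slope = (centre - window[i, j]) / distance
            if tan_slope > 0:                 # only downslope neighbours receive flow
                weights[i, j] = tan_slope ** x
    total = weights.sum()
    return weights / total if total > 0 else weights   # flat or pit cell: no spreading

# Example 3x3 elevation window (metres); flow leaves the centre cell (105 m).
dem_window = np.array([[110.0, 108.0, 107.0],
                       [109.0, 105.0, 103.0],
                       [108.0, 104.0, 101.0]])
print(holmgren_weights(dem_window).round(3))
```

With a larger exponent the weighting concentrates flow into the steepest neighbour, which is what makes x a convenient knob between diffuse spreading and single-direction (D8-like) routing.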

Relevance:

20.00%

Publisher:

Abstract:

We studied the influence of surveyed area size on density estimates by means of camera-trapping in a low-density felid population (1-2 individuals/100 km²). We applied non-spatial capture-recapture (CR) and spatial CR (SCR) models for Eurasian lynx during winter 2005/2006 in the northwestern Swiss Alps by sampling an area divided into 5 nested plots ranging from 65 to 760 km². CR model density estimates (95% CI) for models M0 and Mh decreased from 2.61 (1.55-3.68) and 3.6 (1.62-5.57) independent lynx/100 km², respectively, in the smallest to 1.20 (1.04-1.35) and 1.26 (0.89-1.63) independent lynx/100 km², respectively, in the largest area surveyed. SCR model density estimates also decreased with increasing sampling area, but not significantly. A high degree of individual range overlap relative to the small survey areas (the edge effect) is the most plausible reason for this positive bias in the CR models. Our results confirm that SCR models are much more robust to changes in trap array size than CR models, thus avoiding overestimation of density in smaller areas. However, when a study is concerned with monitoring population changes, large spatial efforts (area surveyed ≥760 km²) are required to obtain reliable and precise density estimates with these population densities and recapture rates.
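To illustrate why the assumed effective area dominates a non-spatial density estimate, here is a minimal sketch, not the M0/Mh or SCR models fitted in the study: a two-occasion Chapman abundance estimate divided by a trap-array area buffered by an assumed movement distance. All numbers are invented.

```python
import math

# Minimal sketch (not the study's M0/Mh or SCR models): a two-occasion Chapman
# abundance estimate divided by a buffered trap-array area, showing how the
# assumed effective area drives a non-spatial density estimate.

def chapman_abundance(n1, n2, m2):
    """Bias-corrected Lincoln-Petersen estimate of population size."""
    return (n1 + 1) * (n2 + 1) / (m2 + 1) - 1

def density_per_100km2(n_hat, trap_polygon_km2, buffer_km):
    """Density over the trap polygon buffered by e.g. half the mean maximum
    distance moved (a common ad hoc edge-effect correction)."""
    side = math.sqrt(trap_polygon_km2)            # treat the polygon as a square
    effective_area = (side + 2 * buffer_km) ** 2
    return 100.0 * n_hat / effective_area

# Invented numbers: 6 and 5 lynx photographed on two occasions, 4 recaptured.
n_hat = chapman_abundance(6, 5, 4)
for area in (65.0, 760.0):                        # smallest and largest nested plots
    print(area, "km^2:", round(density_per_100km2(n_hat, area, buffer_km=5.0), 2),
          "lynx/100 km^2")
```

The buffer matters far more for the small plot than for the large one, which is the edge effect the abstract invokes to explain the inflated non-spatial estimates.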

Relevance:

20.00%

Publisher:

Abstract:

Accurate estimation of the firn lock-in depth is essential for correctly linking gas and ice chronologies in ice core studies. Here, two approaches to constraining the evolution of firn depth in Antarctica over the last deglaciation are presented: outputs of a firn densification model, and measurements of δ15N of N2 in air trapped in ice cores, assuming that δ15N is only affected by gravitational fractionation in the firn column. Since the firn densification process is largely governed by surface temperature and accumulation rate, we have investigated four ice cores drilled in coastal (Berkner Island, BI, and James Ross Island, JRI) and semi-coastal (TALDICE and EPICA Dronning Maud Land, EDML) Antarctic regions. Combined with available ice core air-δ15N measurements from the EPICA Dome C (EDC) site, the studied regions encompass a large range of surface accumulation rates and temperature conditions. Our δ15N profiles reveal a heterogeneous response of the firn structure to glacial–interglacial climatic changes. While firn densification simulations correctly predict the TALDICE δ15N variations, they systematically fail to capture the large millennial-scale δ15N variations measured at BI and the glacial δ15N levels measured at JRI and EDML – a mismatch previously reported for central East Antarctic ice cores. New constraints on the EDML gas–ice depth offset during the Laschamp event (~41 ka) and the last deglaciation do not favour the hypothesis of a large convective zone within the firn as the explanation of the glacial firn model–δ15N data mismatch for this site. While we could not conduct an in-depth study of the influence of snow impurities on firnification from the existing datasets, our detailed comparison between the δ15N profiles and firn model simulations under different temperature and accumulation rate scenarios suggests that the role of accumulation rate may have been underestimated in the current description of firnification models.
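The gravitational-fractionation relation that underlies the use of δ15N as a firn-column proxy is the barometric equation δ15N ≈ [exp(Δm·g·z/(R·T)) − 1]·1000‰, with Δm the molar mass difference between 15N14N and 14N14N and z the diffusive column height. The sketch below evaluates this relation and its inverse; the temperature and δ15N values are illustrative, not the study's measurements.

```python
import math

# Gravitational (barometric) fractionation of 15N/14N in a stagnant firn column:
#   delta15N = [exp(dm * g * z / (R * T)) - 1] * 1000 permil,
# where z is the diffusive column height. Numbers below are illustrative.

DM = 1.0e-3   # molar mass difference between 15N14N and 14N14N (kg/mol)
G = 9.81      # gravitational acceleration (m/s^2)
R = 8.314     # gas constant (J/mol/K)

def delta15n_permil(z, temperature_k):
    """delta15N expected from gravitational settling over a column of height z (m)."""
    return (math.exp(DM * G * z / (R * temperature_k)) - 1.0) * 1000.0

def column_height_m(delta15n, temperature_k):
    """Invert the relation: diffusive column height (m) from a delta15N value (permil)."""
    return R * temperature_k / (DM * G) * math.log(delta15n / 1000.0 + 1.0)

# Example: a 0.45 permil value at -40 degC implies a diffusive column of roughly 90 m.
print(round(column_height_m(0.45, 233.15), 1))
```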

Relevance:

20.00%

Publisher:

Abstract:

The short-lived 182Hf–182W isotope system can provide powerful constraints on the timescales of planetary core formation, but its application to iron meteorites is hampered by neutron capture reactions on W isotopes resulting from exposure to galactic cosmic rays. Here we show that Pt isotopes in magmatic iron meteorites are also affected by capture of (epi)thermal neutrons and that the Pt isotope variations are correlated with variations in 182W/184W. This makes Pt isotopes a sensitive neutron dosimeter for correcting cosmic ray-induced W isotope shifts. The pre-exposure 182W/184W derived from the Pt–W isotope correlations of the IID, IVA and IVB iron meteorites are higher than most previous estimates and are more radiogenic than the initial 182W/184W of Ca–Al-rich inclusions (CAI). The Hf–W model ages for core formation range from +1.6±1.0 million years (Ma; for the IVA irons) to +2.7±1.3 Ma after CAI formation (for the IID irons), indicating that there was a time gap of at least ∼1 Ma between CAI formation and metal segregation in the parent bodies of some iron meteorites. From the Hf–W ages a time limit of <1.5–2 Ma after CAI formation can be inferred for the accretion of the IID, IVA and IVB iron meteorite parent bodies, consistent with earlier conclusions that the accretion of differentiated planetesimals predated that of most chondrite parent bodies.
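As a sketch of how a pre-exposure ε182W translates into a core-formation time, the code below evaluates the standard two-stage Hf–W model age with a 182Hf half-life of 8.9 Myr; the CAI and chondritic ε182W reference values are commonly cited numbers, and the sample value is a placeholder rather than data from the paper.

```python
import math

# Minimal sketch of a two-stage Hf-W model age: metal that segregated at time t
# after CAI formation records the W isotope composition of its chondritic source
# at that time, so
#   eps_chond - eps_metal = (eps_chond - eps_CAI) * exp(-lambda * t)
# and t = (1/lambda) * ln[(eps_chond - eps_CAI) / (eps_chond - eps_metal)].
# Reference epsilon values are commonly cited numbers; the sample is a placeholder.

HALF_LIFE_182HF_MA = 8.90
LAMBDA = math.log(2.0) / HALF_LIFE_182HF_MA   # decay constant of 182Hf (1/Ma)

EPS_W_CAI = -3.49          # commonly used initial eps182W of CAIs
EPS_W_CHONDRITE = -1.90    # present-day chondritic eps182W

def core_formation_age_ma(eps_w_metal):
    """Time of metal segregation after CAI formation, in Ma."""
    return (1.0 / LAMBDA) * math.log(
        (EPS_W_CHONDRITE - EPS_W_CAI) / (EPS_W_CHONDRITE - eps_w_metal)
    )

# Example: a placeholder pre-exposure eps182W of -3.30 yields a model age of ~1.6 Ma.
print(round(core_formation_age_ma(-3.30), 2))
```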

Relevance:

20.00%

Publisher:

Abstract:

Neutron capture effects in meteorites and lunar surface samples have been successfully used in the past to study exposure histories and shielding conditions. In recent years, however, it has turned out that neutron capture effects are a nuisance for some of the short-lived radionuclide systems. The most prominent example is the 182Hf-182W system in iron meteorites, for which neutron capture effects lower the 182W/184W ratio, thereby producing apparent ages that are too old. Here, we present a thorough study of neutron capture effects in iron meteorites, ordinary chondrites, and carbonaceous chondrites, with the focus on iron meteorites. We study in detail the processes responsible for neutron production, neutron transport, and neutron slowing down and find that neutron capture in all studied meteorite types is not, as is usually assumed, exclusively by thermal neutrons. Instead, most of the neutron capture in iron meteorites is in the epithermal energy range, and there is a significant contribution from epithermal neutron capture even in stony meteorites. Using sophisticated particle spectra and evaluated cross section data files for neutron capture reactions, we calculate the neutron capture effects for Sm, Gd, Cd, Pd, Pt, and Os isotopes, which can all serve as neutron-dose proxies, either in stony or in iron meteorites. In addition, we model neutron capture effects in W and Ag isotopes. For W isotopes, the GCR-induced shifts correlate very well with Os and Pt isotope shifts, which can therefore be used as neutron-dose proxies and permit a reliable correction. We also find that GCR-induced effects for the 107Pd-107Ag system can be significant and need to be corrected, a result that is in contrast to earlier studies.
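A minimal sketch of the kind of neutron-dose correction described here: within one meteorite group, measured ε182W is regressed against a Pt-isotope neutron-dose proxy and extrapolated to zero Pt anomaly to recover the pre-exposure ε182W. The data points below are invented for illustration only.

```python
import numpy as np

# Sketch of a neutron-dose correction: regress measured eps182W against a
# Pt-isotope neutron-dose proxy (e.g. eps196Pt) across samples of one meteorite
# group, then extrapolate to zero Pt anomaly to recover pre-exposure eps182W.
# The data points below are invented, not measured values.

eps_pt = np.array([0.10, 0.35, 0.62, 0.90])      # neutron-dose proxy (epsilon units)
eps_w  = np.array([-3.36, -3.42, -3.50, -3.57])  # measured, GCR-shifted eps182W

slope, intercept = np.polyfit(eps_pt, eps_w, 1)  # ordinary least-squares line

print("GCR slope           :", round(slope, 3))      # sensitivity of eps182W to neutron dose
print("pre-exposure eps182W:", round(intercept, 3))  # extrapolated value at zero dose
```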

Relevance:

20.00%

Publisher:

Abstract:

The main goal of the AEgIS experiment at CERN is to test the weak equivalence principle for antimatter. AEgIS will measure the free fall of an antihydrogen beam traversing a moiré deflectometer. The goal is to determine the gravitational acceleration with an initial relative accuracy of 1% by using an emulsion detector combined with a silicon μ-strip detector to measure the time of flight. Nuclear emulsions can measure the annihilation vertex of antihydrogen atoms with a precision of ~1–2 μm r.m.s. We present here results for emulsion detectors operated in vacuum using low-energy antiprotons from the CERN antiproton decelerator. We compare the results with Monte Carlo simulations and discuss the impact on the AEgIS project.

Relevance:

20.00%

Publisher:

Abstract:

We propose to build and operate a detector based on the emulsion film technology for the measurement of the gravitational acceleration on antimatter, to be performed by the AEgIS experiment (AD6) at CERN. The goal of AEgIS is to test the weak equivalence principle with a precision of 1% on the gravitational acceleration g by measuring the vertical position of the annihilation vertex of antihydrogen atoms after their free fall while moving horizontally in a vacuum pipe. With the emulsion technology developed at the University of Bern we propose to improve the performance of AEgIS by exploiting the superior position resolution of emulsion films over other particle detectors. The idea is to use a new type of emulsion films, especially developed for applications in vacuum, to yield a spatial resolution of the order of one micron in the measurement of the sag of the antihydrogen atoms in the gravitational field. This is an order of magnitude better than what was planned in the original AEgIS proposal.
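A back-of-envelope calculation shows why micron-scale resolution is needed: a horizontally moving antihydrogen atom sags by Δy = g·(L/v)²/2 over a flight length L at velocity v. The velocities and flight length below are assumed round numbers, not AEgIS design values.

```python
# Back-of-envelope: vertical sag of a horizontally moving antihydrogen atom,
#   sag = 0.5 * g * (L / v)**2,
# showing why micron-scale vertex resolution matters. The velocities and flight
# length are assumed round numbers, not AEgIS design values.

G = 9.81   # m/s^2, assuming matter-like gravity for the estimate

def sag_um(flight_length_m, velocity_m_s, g=G):
    """Vertical drop (micrometres) accumulated over the horizontal flight."""
    t = flight_length_m / velocity_m_s
    return 0.5 * g * t**2 * 1e6

for v in (250.0, 500.0, 1000.0):          # plausible beam velocities (m/s)
    print(v, "m/s ->", round(sag_um(1.0, v), 1), "micron sag over 1 m")
```

Sags of order tens of microns over a metre of flight explain why a position resolution of about one micron directly improves the achievable precision on g.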

Relevance:

20.00%

Publisher:

Abstract:

Technological advances in gear and fishing practices have driven the global expansion of the American lobster live seafood market. These changes have had a positive effect on the lobster industry by increasing capture efficiency. However, it is unknown what effect these improved methods will have on the post-capture fitness and survival of lobsters. This project utilized a repeated measures design to compare the physiological changes that occur in lobsters over time as the result of differences in depth, hauling rate, and storage methodology. The results indicate that lobsters destined for long distance transport or temporary storage in pounds undergo physiological disturbance as part of the capture process. These changes are significant over time for total hemocyte counts, crustacean hyperglycemic hormone, L-lactate, ammonia, and glucose. Repeated measures multivariate analysis of variance (MANOVA) for glucose indicates a significant interaction between depth and storage methodology over time for non-survivors. A Gram-negative bacterium, Photobacterium indicum, was identified in pure culture from hemolymph samples of 100% of weak lobsters. Histopathology revealed the presence of Gram-negative bacteria throughout the tissues with evidence of antemortem edema and necrosis suggestive of septicemia. On the basis of these findings, we recommend to the lobster industry that if a reduction in depth and hauling rate is not economically feasible, fishermen should take particular care in handling lobsters and provide them with a recovery period in recirculating seawater prior to land transport. The ecological role of P. indicum is not fully defined at this time. However, it may be an emerging opportunistic pathogen of stressed lobsters. Judicious preemptive antibiotic therapy may be necessary to reduce mortality in susceptible lobsters destined for high-density holding facilities.

Relevance:

20.00%

Publisher:

Abstract:

Electron microscopy (EM) allows for the simultaneous visualization of all tissue components at high resolution. However, the extent to which conventional aldehyde fixation and ethanol dehydration of the tissue alter the fine structure of cells and organelles, thereby preventing detection of subtle structural changes induced by an experiment, has remained an issue. Attempts have been made to rapidly freeze tissue to preserve native ultrastructure. Shock-freezing of living tissue under high pressure (high-pressure freezing, HPF) followed by cryosubstitution of the tissue water avoids aldehyde fixation and dehydration in ethanol; the tissue water is immobilized in ∼50 ms, and a close-to-native fine structure of cells, organelles and molecules is preserved. Here we describe a protocol for HPF that is useful to monitor ultrastructural changes associated with functional changes at synapses in the brain but can be applied to many other tissues as well. The procedure requires a high-pressure freezer and takes a minimum of 7 d but can be paused at several points.

Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND: There is converging evidence for the notion that pain affects a broad range of attentional domains. This study investigated the influence of pain on the involuntary capture of attention as indexed by the P3a component in the event-related potential derived from the electroencephalogram. METHODS: Participants performed an auditory oddball task in a pain-free condition and in a pain condition during which they submerged a hand in cold water. Novel, infrequent and unexpected auditory stimuli were presented randomly in a series of frequent standard and infrequent target tones. P3a amplitudes were measured in response to the novel, unexpected stimuli and P3b amplitudes in response to the target stimuli. RESULTS: Both electrophysiological components were characterized by reduced amplitudes in the pain compared with the pain-free condition. Hit rate and reaction time to target stimuli did not differ between the two conditions, presumably because the experimental task was not difficult enough to exceed attentional capacities under pain conditions. CONCLUSIONS: These results indicate that voluntary attention serving the maintenance and control of ongoing information processing (reflected by the P3b amplitude) is impaired by pain. In addition, the involuntary capture of attention and orientation to novel, unexpected information (measured by the P3a) is also impaired by pain. Thus, the neurophysiological measures examined in this study support theoretical positions proposing that pain can reduce attentional processing capacity. These findings have potentially important implications at the theoretical level for our understanding of the interplay of pain and cognition, and at the therapeutic level for the clinical treatment of individuals experiencing ongoing pain.

Relevance:

20.00%

Publisher:

Abstract:

Antihydrogen holds the promise to test, for the first time, the universality of free fall with a system composed entirely of antiparticles. The AEgIS experiment at CERN's antiproton decelerator aims to measure the gravitational interaction between matter and antimatter by measuring the deflection of a beam of antihydrogen in the Earth's gravitational field (g). The principle of the experiment is as follows: cold antihydrogen atoms are synthesized in a Penning–Malmberg trap, are Stark-accelerated towards a moiré deflectometer, the classical counterpart of an atom interferometer, and annihilate on a position-sensitive detector. Crucial to the success of the experiment is the spatial precision of the position-sensitive detector. We propose a novel free-fall detector based on a hybrid of two technologies: emulsion detectors, which have an intrinsic spatial resolution of 50 nm but no temporal information, and a silicon strip / scintillating fiber tracker to provide timing and positional information. In 2012 we tested emulsion films in vacuum with antiprotons from CERN's antiproton decelerator. The annihilation vertices could be observed directly on the emulsion surface using the microscope facility available at the University of Bern. The annihilation vertices were successfully reconstructed with a resolution of 1–2 μm on the impact parameter. If such a precision can be realized in the final detector, Monte Carlo simulations suggest that of order 500 antihydrogen annihilations will be sufficient to determine g with 1% accuracy. This paper presents current research towards the development of this technology for use in the AEgIS apparatus and prospects for the realization of the final detector.
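As a heavily idealised check on the statistical claim, the sketch below assumes g is estimated from the mean vertical displacement of N reconstructed vertices, each measured with resolution σ_y against a mean sag s, so that σ_g/g ≈ σ_y/(s·√N). It ignores beam size, divergence and the moiré fringe analysis actually used; all numbers are assumptions, not AEgIS values.

```python
import math

# Idealised statistics only: if each annihilation vertex has vertical resolution
# sigma_y and the mean gravitational sag over the flight is s, then estimating g
# from the mean vertical displacement of N events gives roughly
#   sigma_g / g ~ sigma_y / (s * sqrt(N)).
# This ignores beam size, divergence and the moire fringe analysis actually used.

def relative_precision(sigma_y_um, sag_um, n_events):
    """Fractional uncertainty on g for the idealised mean-displacement estimator."""
    return sigma_y_um / (sag_um * math.sqrt(n_events))

# Assumed numbers: 2 micron vertex resolution, ~20 micron sag, 500 reconstructed events.
print(f"{relative_precision(2.0, 20.0, 500):.1%}")
```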