925 results for Free-space method


Relevance: 30.00%

Abstract:

In free viewpoint applications, the images are captured by an array of cameras that acquire a scene of interest from different perspectives. Any intermediate viewpoint not included in the camera array can be virtually synthesized by the decoder, at a quality that depends on the distance between the virtual view and the camera views available at the decoder. Hence, it is beneficial for any user to receive camera views that are close to each other for synthesis. This is, however, not always feasible in bandwidth-limited overlay networks, where every node may ask for different camera views. In this work, we propose an optimized delivery strategy for free viewpoint streaming over overlay networks. We introduce the concept of layered quality-of-experience (QoE), which describes the level of interactivity offered to clients. Based on these levels of QoE, camera views are organized into layered subsets. These subsets are then delivered to clients through a prioritized network coding streaming scheme, which accommodates network and client heterogeneity and effectively exploits the resources of the overlay network. Simulation results show that, in a scenario with limited bandwidth or channel reliability, the proposed method outperforms baseline network coding approaches in which the different levels of QoE are not taken into account in the delivery strategy optimization.
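
The layering idea can be sketched in a few lines. The rule below is a hypothetical one, not the paper's optimized subset construction: it assigns camera indices to QoE layers so that each additional layer roughly halves the spacing between the views available to a client.

```python
def layer_views(num_views, num_layers):
    """Assign camera indices to QoE layers: layer 0 gives a coarse,
    evenly spaced subset; each further layer roughly halves the spacing
    between available views (illustrative rule, not the paper's optimizer)."""
    layers, assigned = [], set()
    for level in range(num_layers):
        step = max(1, (num_views - 1) // (2 ** level))
        members = [v for v in range(0, num_views, step) if v not in assigned]
        assigned.update(members)
        layers.append(members)
    return layers
```

For a 9-camera array and 3 layers this yields `[[0, 8], [4], [2, 6]]`: a client receiving only the base layer can still synthesize any viewpoint, while each extra layer shrinks the gap between its nearest reference views.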

Relevance: 30.00%

Abstract:

The acquisition of accurate information on the size of traits in animals is fundamental for the study of animal ecology and evolution and for their management. We demonstrate how morphological traits of free-ranging animals can be reliably estimated at very large observation distances of several hundred meters using ordinary digital photographic equipment and simple photogrammetric software. In our study, we estimated the length of horn annuli in free-ranging male Alpine ibex (Capra ibex), taking already-measured horn annuli of conspecifics on the same photographs as scaling units. Comparisons with hand-measured horn annuli lengths and repeatability analyses revealed a high accuracy of the photogrammetric estimates. If length estimations of specific horn annuli are based on multiple photographs, measurement errors of <5.5 mm can be expected. In the current study, the application of the described photogrammetric procedure increased the sample size of animals with known horn annuli length by an additional 104%. The presented photogrammetric procedure is broadly applicable and represents an easy, robust and cost-efficient method for measuring individuals in populations where animals are hard to capture or to approach.
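
The scaling arithmetic behind such estimates is simple; the sketch below (hypothetical numbers, not the authors' software) scales a pixel measurement by a reference object of known real size on the same photograph and averages estimates from multiple photographs, as the abstract recommends.

```python
def estimate_length(pixel_len, ref_pixel_len, ref_real_len_mm):
    """Scale a measured pixel distance by a reference of known real size
    lying in approximately the same plane of the photograph."""
    return pixel_len * ref_real_len_mm / ref_pixel_len

def averaged_estimate(measurements):
    """Average estimates from multiple photographs to reduce measurement
    error; `measurements` holds (pixel_len, ref_pixel_len, ref_real_len_mm)."""
    estimates = [estimate_length(p, rp, rl) for p, rp, rl in measurements]
    return sum(estimates) / len(estimates)
```

For example, a 120-pixel annulus next to a 60-pixel reference of known 50 mm length gives an estimated 100 mm.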

Relevance: 30.00%

Abstract:

The direct Bayesian admissible region approach is an a priori state-free measurement-association and initial orbit determination technique for optical tracks. In this paper, we test a hybrid approach that appends a least-squares estimator to the direct Bayesian method on measurements taken at the Zimmerwald Observatory of the Astronomical Institute of the University of Bern. Over half of the association pairs agreed with conventional geometric track correlation and least-squares techniques. The remaining pairs shed light on the fundamental limits of conducting tracklet association based solely on dynamical and geometrical information.

Relevance: 30.00%

Abstract:

The flavour of foods is determined by the interaction of taste molecules with receptors in the mouth, and of fragrance or aroma molecules with receptors in the upper part of the nose. Here, we discuss the properties of taste and fragrance molecules, from the public databases Superscent, Flavornet, SuperSweet and BitterDB, taken collectively as flavours, from the perspective of chemical space. We survey simple descriptor profiles in comparison with the public collections ChEMBL (bioactive small molecules), ZINC (commercial drug-like molecules) and GDB-13 (all possible organic molecules of up to 13 atoms of C, N, O, S, Cl). A global analysis of the chemical space of flavours is also presented, based on molecular quantum numbers (MQN) and SMILES fingerprints (SMIfp). While taste molecules span a very broad property range, fragrances occupy a narrow area of chemical space consisting of generally very small and relatively nonpolar molecules distinct from standard drug molecules. Proximity searching in chemical space is exemplified as a simple method to facilitate the search for new fragrances.
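
Proximity searching in a descriptor space amounts to a nearest-neighbour lookup. In the sketch below, the city-block distance is one common choice for MQN-style vectors; the tiny "library" and its three-component vectors are entirely made up for illustration and are not real MQN values.

```python
def city_block(a, b):
    """City-block (Manhattan) distance between two descriptor vectors."""
    return sum(abs(x - y) for x, y in zip(a, b))

def nearest_flavours(query, library, k=3):
    """Rank library molecules by distance to the query in descriptor space;
    the closest entries are candidate analogues (e.g. new fragrances)."""
    ranked = sorted(library.items(), key=lambda kv: city_block(query, kv[1]))
    return [name for name, _ in ranked[:k]]
```

A query vector close to a known fragrance then retrieves that fragrance and its neighbours first, which is the essence of the proximity search described above.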

Relevance: 30.00%

Abstract:

BACKGROUND Because computed tomography (CT) has advantages for visualizing the manifestation of necrosis and local complications, a series of scoring systems based on CT manifestations has been developed for assessing the clinical outcomes of acute pancreatitis (AP), including the CT severity index (CTSI) and the modified CTSI. Although the internationally accepted CTSI has been successfully used to predict the overall mortality and disease severity of AP, recent literature has revealed its limitations. Using the Delphi method, we established a new scoring system based on retrocrural space involvement (RCSI) and compared its effectiveness at evaluating the mortality and severity of AP with that of the CTSI. METHODS We reviewed CT images of 257 patients with AP taken within 3-5 days of admission in 2012. The RCSI scoring system, which includes assessment of infectious conditions involving the retrocrural space and the adjacent pleural cavity, was established using the Delphi method. Two radiologists independently assessed the RCSI and CTSI scores. The predictive power of the RCSI and CTSI scoring systems in evaluating the mortality and severity of AP was estimated using receiver operating characteristic (ROC) curves. PRINCIPAL FINDINGS The RCSI score can accurately predict mortality and disease severity. The area under the ROC curve for the RCSI versus the CTSI score was 0.962±0.011 versus 0.900±0.021 for predicting mortality, and 0.888±0.025 versus 0.904±0.020 for predicting the severity of AP. Applying ROC analysis to our data showed that an RCSI score of 4 was the best cutoff value, above which mortality could be identified. CONCLUSION The Delphi method was innovatively adopted to establish a scoring system to predict the clinical outcome of AP. The RCSI scoring system predicts the mortality of AP better than the CTSI system, and the severity of AP equally well.

Relevance: 30.00%

Abstract:

Free arachidonic acid is functionally interlinked with different lipid signaling networks, including those involving prostanoid pathways, the endocannabinoid system, N-acylethanolamines, as well as steroids. A sensitive and specific LC-MS/MS method for the quantification of arachidonic acid, prostaglandin E2, thromboxane B2, anandamide, 2-arachidonoylglycerol, noladin ether, linoleoyl ethanolamide, oleoyl ethanolamide, palmitoyl ethanolamide, stearoyl ethanolamide, aldosterone, cortisol, dehydroepiandrosterone, progesterone, and testosterone in human plasma was developed and validated. Analytes were extracted using acetonitrile precipitation followed by solid-phase extraction. Separations were performed by UFLC using a C18 column and analyzed on a triple quadrupole MS with electrospray ionization. Analytes were run first in negative mode and, subsequently, in positive mode in two independent LC-MS/MS runs. For each analyte, two MRM transitions were collected in order to confirm identity. All analytes showed good linearity over the investigated concentration range (r>0.98). Validated LLOQs ranged from 0.1 to 190 ng/mL and LODs ranged from 0.04 to 12.3 ng/mL. Our data show that this LC-MS/MS method is suitable for the quantification of a diverse set of bioactive lipids in plasma from human donors (n=32). The determined plasma levels are in agreement with the literature, thus providing a versatile method to explore pathophysiological processes in which changes in these lipids are implicated.

Relevance: 30.00%

Abstract:

We present a novel approach to the reconstruction of depth from light field data. Our method uses dictionary representations and group sparsity constraints to derive a convex formulation. Although our solution results in an increase of the problem dimensionality, we keep numerical complexity at bay by restricting the space of solutions and by exploiting an efficient primal-dual formulation. Comparisons with state-of-the-art techniques, on both synthetic and real data, show promising performance.

Relevance: 30.00%

Abstract:

Background Complete-pelvis segmentation in antero-posterior pelvic radiographs is required to create a patient-specific three-dimensional pelvis model for surgical planning and postoperative assessment in image-free navigation of total hip arthroplasty. Methods A fast and robust framework for accurately segmenting the complete pelvis is presented, consisting of two consecutive modules. In the first module, a three-stage method was developed to delineate the left hemi-pelvis based on statistical appearance and shape models. To handle complex pelvic structures, anatomy-specific information processing techniques were employed. As the input to the second module, the delineated left hemi-pelvis was then reflected about an estimated symmetry line of the radiograph to initialize the right hemi-pelvis segmentation. The right hemi-pelvis was segmented by the same three-stage method. Results Two experiments, conducted on 143 and 40 AP radiographs respectively, demonstrated a mean segmentation accuracy of 1.61±0.68 mm. A clinical study investigating the postoperative assessment of acetabular cup orientations based on the proposed framework revealed an average accuracy of 1.2°±0.9° and 1.6°±1.4° for anteversion and inclination, respectively. Delineation of each radiograph takes less than one minute. Conclusions Although further validation is needed, the preliminary results imply the clinical applicability of the proposed framework for image-free THA.
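
The reflection step that initializes the right hemi-pelvis can be illustrated with basic 2D geometry. The sketch below is an assumption about the underlying geometric operation, not the authors' code: it reflects a landmark about a symmetry line given by a point and a unit direction.

```python
def reflect(point, line_point, line_dir):
    """Reflect a 2D point about a line given by a point on the line and a
    unit direction vector, e.g. the estimated symmetry axis of a radiograph."""
    px, py = point[0] - line_point[0], point[1] - line_point[1]
    dx, dy = line_dir
    t = px * dx + py * dy            # projection onto the line direction
    fx, fy = t * dx, t * dy          # foot of the perpendicular (relative)
    return (line_point[0] + 2 * fx - px, line_point[1] + 2 * fy - py)
```

Applying this to every contour point of the delineated left hemi-pelvis yields a mirrored contour that can seed the right-side segmentation.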

Relevance: 30.00%

Abstract:

Objectives: It has been repeatedly demonstrated that athletes in a state of ego depletion do not perform up to their capabilities in high-pressure situations. We assume that momentarily available self-control strength determines whether individuals in high-pressure situations can resist distracting stimuli. Design/method: In the present study, we applied a between-subjects design in which 31 experienced basketball players were randomly assigned to a depletion group or a non-depletion group. Participants performed 30 free throws while listening to statements representing worrisome thoughts (as frequently experienced in high-pressure situations) over stereo headphones. Participants were instructed to block out these distracting audio messages and focus on the free throws. We postulated that depleted participants would be more likely to be distracted and would perform worse in the free throw task. Results: The results supported our assumptions: depleted participants paid more attention to the distracting stimuli and displayed worse performance in the free throw task. Conclusions: These results indicate that sufficient levels of self-control strength can serve as a buffer against distracting stimuli under pressure.

Relevance: 30.00%

Abstract:

DOI: 10.1002/hlca.19900730309. In three steps, 2-deoxy-D-ribose has been converted into a phosphoramidite building block bearing a (t-Bu)Me2Si protecting group at the OH function of the anomeric centre of the furanose ring. This building block was subsequently incorporated into DNA oligomers of various base sequences using the standard phosphoramidite protocol for automated DNA synthesis. The resulting silyl oligomers were purified by HPLC and selectively desilylated to the corresponding free apurinic DNA sequences. The hexamer d(A-A-A-A-X-A) (X representing the apurinic site), which was prepared in this way, was characterized by 1H- and 31P-NMR spectroscopy. The other sequences, as well as their fragments formed upon treatment with alkali, were analyzed by polyacrylamide gel electrophoresis.

Relevance: 30.00%

Abstract:

The population of space debris has increased drastically in recent years. These objects have become a serious threat to active satellites: because the relative velocities between space debris and satellites are high, space debris objects may destroy active satellites through collisions. Furthermore, collisions involving massive objects produce large numbers of fragments, leading to significant growth of the space debris population. The long-term evolution of the debris population is essentially driven by so-called catastrophic collisions. An effective remediation measure to stabilize the population in Low Earth Orbit (LEO) is therefore the removal of large, massive space debris. To remove these objects, not only precise orbits but also more detailed information about their attitude states will be required. Important properties of an object targeted for removal are its spin period, its spin axis orientation, and their change over time. Rotating objects produce periodic brightness variations with frequencies related to their spin periods. Such a brightness variation over time is called a light curve. Both collecting and processing light curves are challenging for several reasons: light curves may be undersampled, low-frequency components due to phase-angle and atmospheric-extinction changes may be present, and beat frequencies may occur when the rotation period is close to a multiple of the sampling period. Depending on the method used to extract the frequencies, method-specific properties also have to be taken into account. The Astronomical Institute of the University of Bern (AIUB) light curve database will be introduced, which contains more than 1,300 light curves acquired over more than seven years. We will discuss the properties and reliability of different time series analysis methods tested and currently used by AIUB for light curve processing. Extracted frequencies and reconstructed phases for some interesting targets, e.g. GLONASS satellites, for which SLR data were also available for period confirmation, will be presented. Finally, we will present the reconstructed phase and its evolution over time of a High-Area-to-Mass-Ratio (HAMR) object which AIUB observed for several years.

Relevance: 30.00%

Abstract:

Currently, several thousand objects are being tracked in the MEO and GEO regions through optical means. The problem faced in this framework is that of Multiple Target Tracking (MTT). In this context, both the correct associations among the observations and the orbits of the objects have to be determined. The complexity of the MTT problem is defined by its dimension S, where S stands for the number of 'fences' used in the problem; each fence consists of a set of observations that all originate from different targets. For a dimension of S ˃ the MTT problem becomes NP-hard. As of now, no algorithm exists that can solve an NP-hard problem in an optimal manner within a reasonable (polynomial) computation time. However, there are algorithms that can approximate the solution with a realistic computational effort. To this end, an Elitist Genetic Algorithm is implemented to approximately solve the S ˃ MTT problem in an efficient manner. Its complexity is studied and it is found that an approximate solution can be obtained in polynomial time. With the advent of improved sensors and a heightened interest in the problem of space debris, it is expected that the number of tracked objects will grow by an order of magnitude in the near future. This research aims to provide a method that can treat the correlation and orbit determination problems simultaneously, and that is able to efficiently process large data sets with minimal manual intervention.
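
A minimal elitist GA for a toy two-fence association problem might look as follows. The cost matrix, operators and parameters are illustrative assumptions, not the implementation described above: candidate solutions are permutations associating observation i of one fence with observation perm[i] of the next, and the best few individuals survive each generation unchanged (the elitist step).

```python
import random

def elitist_ga(cost, n, generations=200, pop_size=30, elite=2, seed=1):
    """Elitist GA over permutations: minimize the total association cost
    sum(cost[i][perm[i]]). Selection from the better half, swap mutation,
    and elitism (the best `elite` individuals are copied unchanged)."""
    rng = random.Random(seed)

    def fitness(p):
        return sum(cost[i][p[i]] for i in range(n))

    pop = [rng.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        nxt = pop[:elite]                     # elitism: keep the best
        while len(nxt) < pop_size:
            parent = list(rng.choice(pop[:pop_size // 2]))
            i, j = rng.sample(range(n), 2)    # swap mutation
            parent[i], parent[j] = parent[j], parent[i]
            nxt.append(parent)
        pop = nxt
    return min(pop, key=fitness)
```

On a 5x5 cost matrix whose cheapest association is the identity permutation, the GA reliably converges to a zero-cost assignment within a few generations.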

Relevance: 30.00%

Abstract:

Any image-processing object detection algorithm somehow tries to integrate the object light (Recognition Step) and applies statistical criteria to distinguish objects of interest from other objects or from pure background (Decision Step). There are various ways in which these two basic steps can be realized, as can be seen in the different detection methods proposed in the literature. An ideal detection algorithm should provide high recognition sensitivity with high decision accuracy and require a reasonable computation effort. In reality, a gain in sensitivity is usually only possible at the cost of decision accuracy and higher computational effort, so automatic detection of faint streaks is still a challenge. This paper presents a detection algorithm using spatial filters simulating the geometrical form of possible streaks on a CCD image, realized by image convolution. The goal of this method is to generate a more or less perfect match between a streak and a filter by varying the length and orientation of the filters. The convolution answers are accepted or rejected according to an overall threshold given by the background statistics. As a first result, this approach yields a huge number of accepted answers due to filters partially covering streaks or remaining stars. To avoid this, a set of additional acceptance criteria has been included in the detection method. All criteria parameters are justified by background and streak statistics, and they affect the detection sensitivity only marginally. Tests on images containing simulated streaks and on real images containing satellite streaks show very promising sensitivity, reliability and running speed for this detection method. Since all method parameters are based on statistics, the true-alarm as well as the false-alarm probability are well controllable. Moreover, the proposed method does not pose any extraordinary demands on the computer hardware or on the image acquisition process.
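
The filter-bank idea can be sketched with NumPy: build oriented line kernels, convolve them with the image, and accept responses above a threshold derived from background statistics. The kernel construction and the thresholding rule below are simplified assumptions, not the paper's exact acceptance criteria.

```python
import numpy as np

def line_kernel(length, angle_deg):
    """Binary kernel approximating a streak of given length and orientation,
    normalized so a perfect match yields the streak's mean intensity."""
    size = length + (length + 1) % 2          # force odd kernel size
    k = np.zeros((size, size))
    c = size // 2
    a = np.deg2rad(angle_deg)
    for t in range(-(length // 2), length // 2 + 1):
        k[c + int(round(t * np.sin(a))), c + int(round(t * np.cos(a)))] = 1.0
    return k / k.sum()

def detect(image, length, angles, n_sigma=5.0):
    """Convolve with a bank of oriented kernels and keep, per pixel, the best
    response; accept pixels above mean + n_sigma * std of the image, used
    here as a crude background estimate."""
    best = np.full(image.shape, -np.inf)
    pad = length
    padded = np.pad(image, pad)               # zero padding at the borders
    for ang in angles:
        k = line_kernel(length, ang)
        h, w = k.shape
        resp = np.zeros(image.shape)
        for i in range(image.shape[0]):
            for j in range(image.shape[1]):
                win = padded[i + pad - h // 2:i + pad + h // 2 + 1,
                             j + pad - w // 2:j + pad + w // 2 + 1]
                resp[i, j] = (win * k).sum()
        best = np.maximum(best, resp)
    threshold = image.mean() + n_sigma * image.std()
    return best > threshold
```

On a toy image containing a single horizontal streak, the horizontal kernel responds with the full streak intensity at the streak's centre, well above the background-derived threshold, while empty pixels stay below it.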

Relevance: 30.00%

Abstract:

Several lake ice phenology studies from satellite data have been undertaken. However, the availability of long-term lake freeze-thaw cycles, required to understand this proxy for climate variability and change, is scarce for European lakes. Long time series from space observations are limited to a few satellite sensors. Data from the Advanced Very High Resolution Radiometer (AVHRR) are used on account of their unique potential: they offer daily global coverage from the early 1980s, expectedly until 2022. An automatic two-step extraction was developed which makes use of near-infrared reflectance values and thermal-infrared-derived lake surface water temperatures to extract lake ice phenology dates. In contrast to other studies utilizing thermal infrared, the thresholds are derived from the data itself, making it unnecessary to define arbitrary or lake-specific thresholds. Two lakes in the Baltic region and a steppe lake on the Austrian–Hungarian border were selected. The latter was used to test the applicability of the approach to another climatic region for the time period 1990 to 2012. A comparison of the extracted event dates with in situ data showed good agreement, with a mean absolute error of about 10 days. The two-step extraction was found to be applicable to European lakes in different climate regions and could fill existing data gaps in future applications. The extension of the time series to the full AVHRR record length (early 1980s until today), with adequate length for trend estimation, would be of interest to assess climate variability and change. Furthermore, the two-step extraction itself is not sensor-specific and could be applied to other sensors with equivalent near- and thermal-infrared spectral bands.
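
A toy version of the self-calibrating threshold idea: derive the threshold from the series itself and report the first date it is crossed. The splitting rule below is a deliberate simplification of the two-step extraction described above, not the actual algorithm.

```python
def data_driven_threshold(values):
    """Derive a threshold from the series itself, as the midpoint between
    the means of the lower and upper halves of the sorted values; no fixed
    or lake-specific constant is needed."""
    s = sorted(values)
    lower, upper = s[:len(s) // 2], s[len(s) // 2:]
    return (sum(lower) / len(lower) + sum(upper) / len(upper)) / 2

def first_crossing(dates, values, threshold, rising=True):
    """First date at which the series crosses the threshold, e.g. ice
    break-up when lake surface temperature rises through it."""
    for d, v in zip(dates, values):
        if (v > threshold) if rising else (v < threshold):
            return d
    return None
```

For a series that jumps from near-frozen to open-water values, the derived threshold falls between the two regimes and the crossing date marks the transition.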

Relevance: 30.00%

Abstract:

The concentration of 11-nor-9-carboxy-Δ(9)-tetrahydrocannabinol (THCCOOH) in whole blood is used as a parameter for assessing the consumption behavior of cannabis users. The blood level of THCCOOH-glucuronide might provide additional information about the frequency of cannabis use. To verify this assumption, a column-switching liquid chromatography-tandem mass spectrometry (LC-MS/MS) method for the rapid and direct quantification of free and glucuronidated THCCOOH in human whole blood was newly developed. The method comprised protein precipitation, followed by injection of the processed sample onto a trapping column and subsequent gradient elution to an analytical column for separation and detection. The total LC run time was 4.5 min. Detection of the analytes was accomplished by electrospray ionization in positive ion mode and selected reaction monitoring using a triple-stage quadrupole mass spectrometer. The method was fully validated by evaluating the following parameters: linearity, lower limit of quantification, accuracy and imprecision, selectivity, extraction efficiency, matrix effect, carry-over, dilution integrity, analyte stability, and re-injection reproducibility. All parameters met the predefined acceptance criteria. Linearity ranged from 5.0 to 500 μg/L for both analytes. The method was successfully applied to whole blood samples from a large collective of cannabis consumers, demonstrating its applicability in the forensic field.