41 results for Free-space method
Abstract:
We present a novel approach to the reconstruction of depth from light field data. Our method uses dictionary representations and group sparsity constraints to derive a convex formulation. Although our solution results in an increase of the problem dimensionality, we keep the numerical complexity at bay by restricting the space of solutions and by exploiting an efficient primal-dual formulation. Comparisons with state-of-the-art techniques, on both synthetic and real data, show promising performance.
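The abstract does not give the exact optimization problem or solver; as a hedged illustration of the kind of group-sparse convex formulation and primal-dual (Chambolle-Pock-style) iteration it refers to, the sketch below solves min_x 0.5*||Dx - y||^2 + lam * sum_g ||x_g||_2. The dictionary D, the group structure, and the weight lam are hypothetical stand-ins, not the paper's.

    import numpy as np

    def group_prox(x, groups, thresh):
        # Proximal operator of thresh * sum_g ||x_g||_2 (group soft-thresholding).
        out = x.copy()
        for g in groups:
            norm = np.linalg.norm(x[g])
            out[g] = 0.0 if norm <= thresh else (1.0 - thresh / norm) * x[g]
        return out

    def primal_dual_group_sparse(D, y, groups, lam, n_iter=500):
        # Chambolle-Pock iteration for  min_x 0.5*||D x - y||^2 + lam * sum_g ||x_g||_2.
        L = np.linalg.norm(D, 2)           # operator norm of D
        tau = sigma = 0.9 / L              # step sizes satisfying tau * sigma * L^2 < 1
        x = np.zeros(D.shape[1])
        x_bar = x.copy()
        z = np.zeros(D.shape[0])
        for _ in range(n_iter):
            # Dual step: proximal operator of the conjugate of 0.5*||. - y||^2.
            z = (z + sigma * (D @ x_bar) - sigma * y) / (1.0 + sigma)
            # Primal step: group soft-thresholding.
            x_new = group_prox(x - tau * (D.T @ z), groups, tau * lam)
            x_bar = 2.0 * x_new - x
            x = x_new
        return x

    # Toy usage with a random dictionary and two coefficient groups (hypothetical sizes).
    rng = np.random.default_rng(0)
    D = rng.standard_normal((40, 10))
    y = rng.standard_normal(40)
    groups = [np.arange(0, 5), np.arange(5, 10)]
    depth_coeffs = primal_dual_group_sparse(D, y, groups, lam=0.5)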
Abstract:
Background: Complete-pelvis segmentation in antero-posterior pelvic radiographs is required to create a patient-specific three-dimensional pelvis model for surgical planning and postoperative assessment in image-free navigation of total hip arthroplasty. Methods: A fast and robust framework for accurately segmenting the complete pelvis is presented, consisting of two consecutive modules. In the first module, a three-stage method was developed to delineate the left hemi-pelvis based on statistical appearance and shape models. To handle complex pelvic structures, anatomy-specific information processing techniques were employed. As the input to the second module, the delineated left hemi-pelvis was then reflected about an estimated symmetry line of the radiograph to initialize the right hemi-pelvis segmentation. The right hemi-pelvis was segmented by the same three-stage method. Results: Two experiments conducted on 143 and 40 AP radiographs, respectively, demonstrated a mean segmentation accuracy of 1.61±0.68 mm. A clinical study investigating the postoperative assessment of acetabular cup orientation based on the proposed framework revealed an average accuracy of 1.2°±0.9° and 1.6°±1.4° for anteversion and inclination, respectively. Delineation of each radiograph takes less than one minute. Conclusions: Although further validation is needed, these preliminary results suggest the clinical applicability of the proposed framework for image-free THA.
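The initialization step described above, mirroring the delineated left hemi-pelvis about an estimated symmetry line, can be illustrated with a minimal sketch; the contour array and the column position of the symmetry line below are hypothetical stand-ins for the framework's actual data structures.

    import numpy as np

    def mirror_contour(contour_xy, symmetry_x):
        # Reflect 2-D contour points (x, y) about the vertical line x = symmetry_x.
        mirrored = contour_xy.copy()
        mirrored[:, 0] = 2.0 * symmetry_x - contour_xy[:, 0]
        return mirrored

    # Hypothetical left hemi-pelvis landmarks (pixel coordinates) and symmetry line.
    left_contour = np.array([[120.0, 310.0], [135.0, 298.0], [150.0, 305.0]])
    right_init = mirror_contour(left_contour, symmetry_x=256.0)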
Abstract:
Objectives: It has been repeatedly demonstrated that athletes in a state of ego depletion do not perform up to their capabilities in high-pressure situations. We assume that momentarily available self-control strength determines whether individuals in high-pressure situations can resist distracting stimuli. Design/method: In the present study, we applied a between-subjects design in which 31 experienced basketball players were randomly assigned to a depletion group or a non-depletion group. Participants performed 30 free throws while listening to statements representing worrisome thoughts (as frequently experienced in high-pressure situations) over stereo headphones. Participants were instructed to block out these distracting audio messages and focus on the free throws. We postulated that depleted participants would be more likely to be distracted and would also perform worse in the free throw task. Results: The results supported our assumptions, as depleted participants paid more attention to the distracting stimuli. In addition, they displayed worse performance in the free throw task. Conclusions: These results indicate that sufficient levels of self-control strength can serve as a buffer against distracting stimuli under pressure.
Abstract:
In three steps, 2-deoxy-D-ribose has been converted into a phosphoramidite building block bearing a (t-Bu)Me2Si protecting group at the OH function of the anomeric centre of the furanose ring. This building block was subsequently incorporated into DNA oligomers of various base sequences using the standard phosphoramidite protocol for automated DNA synthesis. The resulting silyl-oligomers were purified by HPLC and selectively desilylated to the corresponding free apurinic DNA sequences. The hexamer d(A-A-A-A-X-A) (X representing the apurinic site), which was prepared in this way, was characterized by 1H- and 31P-NMR spectroscopy. The other sequences, as well as the fragments they formed upon treatment with alkali, were analyzed by polyacrylamide gel electrophoresis.
Abstract:
The population of space debris has increased drastically during the last years. These objects have become a great threat to active satellites. Because the relative velocities between space debris and satellites are high, space debris objects may destroy active satellites through collisions. Furthermore, collisions involving massive objects produce a large number of fragments, leading to significant growth of the space debris population. The long-term evolution of the debris population is essentially driven by so-called catastrophic collisions. An effective remediation measure to stabilize the population in Low Earth Orbit (LEO) is therefore the removal of large, massive space debris. To remove these objects, not only precise orbits but also more detailed information about their attitude states will be required. Important properties of an object targeted for removal are its spin period, its spin axis orientation, and their change over time. Rotating objects produce periodic brightness variations with frequencies that are related to the spin periods. Such a brightness variation over time is called a light curve. Collecting, but also processing, light curves is challenging for several reasons: light curves may be undersampled, low-frequency components due to phase angle and atmospheric extinction changes may be present, and beat frequencies may occur when the rotation period is close to a multiple of the sampling period. Depending on the method used to extract the frequencies, method-specific properties also have to be taken into account. The Astronomical Institute of the University of Bern (AIUB) light curve database will be introduced, which contains more than 1,300 light curves acquired over more than seven years. We will discuss the properties and reliability of different time series analysis methods tested and currently used by AIUB for light curve processing. Extracted frequencies and reconstructed phases for some interesting targets, e.g. GLONASS satellites, for which SLR data were also available for period confirmation, will be presented. Finally, we will present the reconstructed phase and its evolution over time of a High-Area-to-Mass-Ratio (HAMR) object which AIUB observed for several years.
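As a hedged illustration of the kind of frequency extraction from an unevenly sampled light curve described above (not AIUB's actual processing chain), the sketch below removes slow phase-angle/extinction variations with a low-order polynomial and picks the dominant spin frequency from a Lomb-Scargle periodogram; the epoch and magnitude arrays, frequency grid, and polynomial degree are hypothetical choices.

    import numpy as np
    from scipy.signal import lombscargle

    def extract_spin_frequency(t, mag, f_min=0.01, f_max=2.0, n_freq=5000):
        # Remove slow trends (phase angle, extinction) with a low-order polynomial fit.
        trend = np.polyval(np.polyfit(t, mag, deg=2), t)
        residual = mag - trend
        # Lomb-Scargle periodogram on the unevenly sampled residual (angular frequencies).
        freqs_hz = np.linspace(f_min, f_max, n_freq)
        power = lombscargle(t, residual, 2.0 * np.pi * freqs_hz)
        return freqs_hz[np.argmax(power)]

    # Hypothetical light curve: 0.25 Hz spin signature, irregular sampling, slow trend.
    rng = np.random.default_rng(1)
    t = np.sort(rng.uniform(0.0, 300.0, 400))                       # seconds
    mag = (0.3 * np.sin(2.0 * np.pi * 0.25 * t) + 0.002 * t
           + 0.05 * rng.standard_normal(t.size))
    f_spin = extract_spin_frequency(t, mag)                         # ~0.25 Hz, period ~4 s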
Abstract:
Currently, several thousand objects are being tracked in the MEO and GEO regions through optical means. The problem faced in this framework is that of Multiple Target Tracking (MTT). In this context, both the correct associations among the observations and the orbits of the objects have to be determined. The complexity of the MTT problem is defined by its dimension S, where S stands for the number of 'fences' used in the problem; each fence consists of a set of observations that all originate from different targets. For a dimension of S > 2 the MTT problem becomes NP-hard. As of now, no algorithm exists that can solve an NP-hard problem in an optimal manner within a reasonable (polynomial) computation time. However, there are algorithms that can approximate the solution with a realistic computational effort. To this end, an Elitist Genetic Algorithm is implemented to approximately solve the S > 2 MTT problem in an efficient manner. Its complexity is studied, and it is found that an approximate solution can be obtained in polynomial time. With the advent of improved sensors and a heightened interest in the problem of space debris, it is expected that the number of tracked objects will grow by an order of magnitude in the near future. This research aims to provide a method that can treat the correlation and orbit determination problems simultaneously and is able to efficiently process large data sets with minimal manual intervention.
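The abstract does not spell out the genetic encoding or the fitness function it uses. As a hedged sketch of an elitist genetic algorithm applied to a toy association problem (a single permutation of observations against targets scored with a random cost matrix, not the paper's formulation), one could write:

    import numpy as np

    def order_crossover(p1, p2, rng):
        # Order crossover (OX): copy a slice from parent 1, fill the rest in parent 2's order.
        n = len(p1)
        a, b = sorted(rng.choice(n, size=2, replace=False))
        child = np.full(n, -1)
        child[a:b + 1] = p1[a:b + 1]
        fill = [g for g in p2 if g not in child[a:b + 1]]
        child[child == -1] = fill
        return child

    def elitist_ga(cost, pop_size=60, n_gen=200, n_elite=4, mut_rate=0.2, seed=0):
        # Minimize sum_i cost[i, perm[i]] over permutations (toy association fitness).
        rng = np.random.default_rng(seed)
        n = cost.shape[0]
        pop = [rng.permutation(n) for _ in range(pop_size)]
        for _ in range(n_gen):
            fitness = np.array([cost[np.arange(n), p].sum() for p in pop])
            order = np.argsort(fitness)
            elites = [pop[i].copy() for i in order[:n_elite]]    # elitism: keep the best as-is
            children = []
            while len(children) < pop_size - n_elite:
                # Tournament selection of two parents.
                i, j = rng.choice(pop_size, size=2, replace=False)
                k, l = rng.choice(pop_size, size=2, replace=False)
                p1 = pop[i] if fitness[i] < fitness[j] else pop[j]
                p2 = pop[k] if fitness[k] < fitness[l] else pop[l]
                child = order_crossover(p1, p2, rng)
                if rng.random() < mut_rate:                       # swap mutation
                    a, b = rng.choice(n, size=2, replace=False)
                    child[a], child[b] = child[b], child[a]
                children.append(child)
            pop = elites + children
        fitness = np.array([cost[np.arange(n), p].sum() for p in pop])
        return pop[int(np.argmin(fitness))]

    # Toy usage: associate 8 observations with 8 targets under a random cost matrix.
    rng = np.random.default_rng(2)
    best_assignment = elitist_ga(rng.random((8, 8)))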
Abstract:
Any image processing object detection algorithm somehow tries to integrate the object light (Recognition Step) and applies statistical criteria to distinguish objects of interest from other objects or from pure background (Decision Step). There are various possibilities for how these two basic steps can be realized, as can be seen in the different detection methods proposed in the literature. An ideal detection algorithm should provide high recognition sensitivity with high decision accuracy and require a reasonable computation effort. In reality, a gain in sensitivity is usually only possible with a loss in decision accuracy and with a higher computational effort. So, automatic detection of faint streaks is still a challenge. This paper presents a detection algorithm using spatial filters simulating the geometrical form of possible streaks on a CCD image. This is realized by image convolution. The goal of this method is to generate a more or less perfect match between a streak and a filter by varying the length and orientation of the filters. The convolution answers are accepted or rejected according to an overall threshold given by the background statistics. This approach yields as a first result a huge amount of accepted answers due to filters partially covering streaks or remaining stars. To avoid this, a set of additional acceptance criteria has been included in the detection method. All criteria parameters are justified by background and streak statistics, and they affect the detection sensitivity only marginally. Tests on images containing simulated streaks and on real images containing satellite streaks show a very promising sensitivity, reliability and running speed for this detection method. Since all method parameters are based on statistics, the true-alarm as well as the false-alarm probability are well controllable. Moreover, the proposed method does not pose any extraordinary demands on the computer hardware or on the image acquisition process.
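As a hedged sketch of the matched-filter idea described above (not the paper's implementation), the following convolves an image with a small bank of oriented line kernels and accepts pixels whose response exceeds a threshold derived from robust background statistics; the kernel lengths, orientations, and sigma factor are hypothetical choices.

    import numpy as np
    from scipy import ndimage

    def line_kernel(length, angle_deg):
        # Unit-sum kernel approximating a straight streak of given length and orientation.
        half = length // 2
        k = np.zeros((2 * half + 1, 2 * half + 1))
        angle = np.deg2rad(angle_deg)
        for t in np.linspace(-half, half, 4 * length):
            r = int(round(half + t * np.sin(angle)))
            c = int(round(half + t * np.cos(angle)))
            k[r, c] = 1.0
        return k / k.sum()

    def detect_streaks(image, lengths=(11, 21), angles=range(0, 180, 15), n_sigma=5.0):
        # Threshold each filter response against its own robust background statistics.
        mask = np.zeros(image.shape, dtype=bool)
        for length in lengths:
            for angle in angles:
                response = ndimage.convolve(image, line_kernel(length, angle), mode="reflect")
                med = np.median(response)
                sigma = 1.4826 * np.median(np.abs(response - med))   # MAD-based std estimate
                mask |= response > med + n_sigma * sigma
        return mask

    # Toy usage: a faint diagonal streak, well below 5 sigma of the per-pixel noise.
    rng = np.random.default_rng(3)
    img = rng.normal(100.0, 3.0, (128, 128))
    rr = np.arange(40, 90)
    img[rr, rr] += 6.0
    streak_mask = detect_streaks(img)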
Abstract:
Several lake ice phenology studies based on satellite data have been undertaken. However, the availability of long-term lake freeze-thaw cycles, required to understand this proxy for climate variability and change, is scarce for European lakes. Long time series from space observations are limited to a few satellite sensors. Data of the Advanced Very High Resolution Radiometer (AVHRR) are used on account of their unique potential, as they offer daily global coverage from the early 1980s expectedly until 2022. An automatic two-step extraction was developed, which makes use of near-infrared reflectance values and thermal-infrared-derived lake surface water temperatures to extract lake ice phenology dates. In contrast to other studies utilizing thermal infrared, the thresholds are derived from the data itself, making it unnecessary to define arbitrary or lake-specific thresholds. Two lakes in the Baltic region and a steppe lake on the Austrian–Hungarian border were selected. The latter was used to test the applicability of the approach to another climatic region for the time period 1990 to 2012. A comparison of the extracted event dates with in situ data showed good agreement, with a mean absolute error of about 10 days. The two-step extraction was found to be applicable to European lakes in different climate regions and could fill existing data gaps in future applications. The extension of the time series to the full AVHRR record length (early 1980s until today), with adequate length for trend estimations, would be of interest to assess climate variability and change. Furthermore, the two-step extraction itself is not sensor-specific and could be applied to other sensors with equivalent near- and thermal-infrared spectral bands.
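The abstract does not state how its data-driven thresholds are derived. As a hedged illustration of the general idea only (thresholds taken here from the midpoint between the lower and upper quartiles, a hypothetical choice), the sketch below classifies daily observations as ice or open water and reads freeze-up and break-up dates from the transitions.

    import numpy as np

    def data_driven_threshold(values):
        # Hypothetical data-driven threshold: midpoint between the lower and upper quartiles.
        return 0.5 * (np.percentile(values, 25) + np.percentile(values, 75))

    def ice_phenology(day_of_season, nir_reflectance, lswt_celsius):
        # Ice if NIR reflectance is high (snow/ice is bright) and surface temperature is low.
        nir_thr = data_driven_threshold(nir_reflectance)
        lswt_thr = data_driven_threshold(lswt_celsius)
        ice = (nir_reflectance > nir_thr) & (lswt_celsius < lswt_thr)
        if not ice.any():
            return None, None
        freeze_up = day_of_season[np.argmax(ice)]                       # first ice day
        break_up = day_of_season[len(ice) - 1 - np.argmax(ice[::-1])]   # last ice day
        return freeze_up, break_up

    # Hypothetical one-winter series of daily observations.
    days = np.arange(0, 240)
    lswt = 8.0 - 16.0 * np.exp(-((days - 120.0) / 50.0) ** 2)   # cold mid-winter dip
    nir = np.where(lswt < 0.0, 0.5, 0.08)                       # bright when frozen
    freeze_up_day, break_up_day = ice_phenology(days, nir, lswt)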
Abstract:
The concentration of 11-nor-9-carboxy-Δ(9)-tetrahydrocannabinol (THCCOOH) in whole blood is used as a parameter for assessing the consumption behavior of cannabis consumers. The blood level of THCCOOH-glucuronide might provide additional information about the frequency of cannabis use. To verify this assumption, a column-switching liquid chromatography-tandem mass spectrometry (LC-MS/MS) method for the rapid and direct quantification of free and glucuronidated THCCOOH in human whole blood was newly developed. The method comprised protein precipitation, followed by injection of the processed sample onto a trapping column and subsequent gradient elution to an analytical column for separation and detection. The total LC run time was 4.5 min. Detection of the analytes was accomplished by electrospray ionization in positive ion mode and selected reaction monitoring on a triple-stage quadrupole mass spectrometer. The method was fully validated by evaluating the following parameters: linearity, lower limit of quantification, accuracy and imprecision, selectivity, extraction efficiency, matrix effect, carry-over, dilution integrity, analyte stability, and re-injection reproducibility. All parameters were evaluated and met the predefined acceptance criteria. Linearity ranged from 5.0 to 500 μg/L for both analytes. The method was successfully applied to whole blood samples from a large collective of cannabis consumers, demonstrating its applicability in the forensic field.
Abstract:
Asynchronous level-crossing sampling analog-to-digital converters (ADCs) are known to be more energy efficient and to produce fewer samples than their equidistantly sampling counterparts. However, as the required threshold voltage is lowered, the number of samples and, in turn, the data rate and the energy consumed by the overall system increase. In this paper, we present a cubic Hermitian vector-based technique for online compression of asynchronously sampled electrocardiogram signals. The proposed method provides computationally efficient data compression. The algorithm has complexity O(n) and is thus well suited for asynchronous ADCs. Our algorithm requires no data buffering, maintaining the energy advantage of asynchronous ADCs. The proposed method achieves a compression ratio of up to 90%, with achievable percentage root-mean-square difference ratios as low as 0.97. The algorithm preserves the superior feature-to-feature timing accuracy of asynchronously sampled signals. These advantages are achieved in a computationally efficient manner since algorithm boundary parameters for the signals are extracted a priori.
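As a hedged sketch only: the paper's algorithm is buffer-free, but the simplified greedy version below keeps the intermediate samples of the current segment to check the error bound, which is easier to show compactly. It compresses irregularly spaced samples by keeping a sample only when a cubic Hermite segment from the last kept sample can no longer reproduce the skipped samples within a tolerance; the tolerance and finite-difference slope estimates are hypothetical choices.

    import numpy as np

    def hermite_eval(t, t0, t1, v0, v1, m0, m1):
        # Cubic Hermite interpolation between (t0, v0) and (t1, v1) with end slopes m0, m1.
        h = t1 - t0
        s = (t - t0) / h
        return ((2 * s**3 - 3 * s**2 + 1) * v0 + (s**3 - 2 * s**2 + s) * h * m0
                + (-2 * s**3 + 3 * s**2) * v1 + (s**3 - s**2) * h * m1)

    def compress_hermite(t, v, tol):
        # Greedy sketch: extend the current Hermite segment until its error exceeds tol.
        slopes = np.gradient(v, t)          # finite-difference slope estimates
        kept = [0]
        last = 0
        for j in range(2, len(t)):
            seg = slice(last + 1, j)        # samples the candidate segment must cover
            approx = hermite_eval(t[seg], t[last], t[j], v[last], v[j],
                                  slopes[last], slopes[j])
            if np.max(np.abs(approx - v[seg])) > tol:
                kept.append(j - 1)          # previous sample becomes a new knot
                last = j - 1
        kept.append(len(t) - 1)
        return np.array(kept)

    # Hypothetical level-crossing samples of a smooth waveform.
    rng = np.random.default_rng(4)
    t = np.sort(rng.uniform(0.0, 1.0, 300))
    v = np.sin(2 * np.pi * 3 * t)
    idx = compress_hermite(t, v, tol=0.01)
    compression_ratio = 1.0 - idx.size / t.size     # fraction of samples discarded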
Abstract:
We show the existence of free dense subgroups, generated by two elements, in the holomorphic shear and overshear group of complex Euclidean space and extend this result to the group of holomorphic automorphisms of Stein manifolds with the density property, provided there exists a generalized translation. The conjugation operator associated to this generalized translation is hypercyclic on the topological space of holomorphic automorphisms.
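For orientation, and as a reminder of the standard definitions rather than a statement from the paper: after a linear change of coordinates, shears and overshears of complex Euclidean space $\mathbb{C}^n$ are usually written as

    \text{shear:}\quad (z_1,\dots,z_n) \mapsto \bigl(z_1,\dots,z_{n-1},\; z_n + f(z_1,\dots,z_{n-1})\bigr),
    \text{overshear:}\quad (z_1,\dots,z_n) \mapsto \bigl(z_1,\dots,z_{n-1},\; e^{g(z_1,\dots,z_{n-1})}\, z_n + f(z_1,\dots,z_{n-1})\bigr),

where $f$ and $g$ are entire functions of the first $n-1$ variables.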