67 results for compressive sampling
Abstract:
During acts of physical aggression, offenders frequently come into contact with the victim's clothes, thereby leaving traces of DNA-bearing biological material on the garments. Since tape-lifting and swabbing, the currently established methods for non-destructive trace DNA sampling from clothing, both have shortcomings in collection efficiency and handling, we sought a new collection method for these challenging samples. Testing two readily available electrostatic devices for their potential to sample biological material from garments made of different fabrics, we found one of them, the electrostatic dust print lifter (DPL), to perform comparably to well-established sampling with wet cotton swabs. In simulated aggression scenarios, the DPL and wet swabbing achieved the same success rate for establishing single aggressor profiles suitable for database submission. However, a substantial amount of information was lost with electrostatic sampling: unlike conventional swabbing, it yielded almost no mixed aggressor-victim profiles suitable for database entry. This study serves as a proof of principle for electrostatic DNA sampling from items of clothing. The technique still requires optimization before it can be used in real casework, but we are confident that it could become an efficient and convenient addition to the toolbox of forensic practitioners.
Abstract:
Monte Carlo integration is firmly established as the basis for most practical realistic image synthesis algorithms because of its flexibility and generality. However, the visual quality of rendered images often suffers from estimator variance, which appears as visually distracting noise. Adaptive sampling and reconstruction algorithms reduce variance by controlling the sampling density and aggregating samples in a reconstruction step, possibly over large image regions. In this paper we survey recent advances in this area. We distinguish between “a priori” methods that analyze the light transport equations and derive sampling rates and reconstruction filters from this analysis, and “a posteriori” methods that apply statistical techniques to sets of samples to drive the adaptive sampling and reconstruction process. They typically estimate the errors of several reconstruction filters, and select the best filter locally to minimize error. We discuss advantages and disadvantages of recent state-of-the-art techniques, and provide visual and quantitative comparisons. Some of these techniques are proving useful in real-world applications, and we aim to provide an overview for practitioners and researchers to assess these approaches. In addition, we discuss directions for potential further improvements.
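To make the "a posteriori" idea concrete, here is a minimal sketch (all function and array names are hypothetical, not taken from any cited method): it estimates, per pixel, the error of a few candidate Gaussian reconstruction filters from the sample mean and variance, and keeps the filter with the lowest estimated error.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def select_filter_per_pixel(mean_img, var_img, widths=(0.5, 1.0, 2.0, 4.0)):
    """For each pixel, pick the Gaussian reconstruction filter whose
    estimated error (squared-bias proxy + residual variance) is smallest.

    mean_img : per-pixel sample mean of the Monte Carlo estimate
    var_img  : per-pixel variance of that mean
    """
    best_err = np.full(mean_img.shape, np.inf)
    result = mean_img.copy()
    for w in widths:
        filtered = gaussian_filter(mean_img, sigma=w)
        # Squared-bias proxy: deviation of the filtered image from the noisy mean.
        bias2 = (filtered - mean_img) ** 2
        # Rough variance left after averaging over a (2w+1)-wide neighbourhood.
        residual_var = gaussian_filter(var_img, sigma=w) / (2.0 * w + 1.0) ** 2
        err = bias2 + residual_var
        mask = err < best_err
        best_err[mask] = err[mask]
        result[mask] = filtered[mask]
    return result
```

Published techniques estimate the bias term far more carefully (e.g., with SURE-style estimators); the deviation-from-mean proxy above is only for illustration.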
Abstract:
With the ongoing shift in the computer graphics industry toward Monte Carlo rendering, there is a need for effective, practical noise-reduction techniques that are applicable to a wide range of rendering effects and easily integrated into existing production pipelines. This course surveys recent advances in image-space adaptive sampling and reconstruction algorithms for noise reduction, which have proven very effective at reducing the computational cost of Monte Carlo techniques in practice. These approaches leverage advanced image-filtering techniques with statistical methods for error estimation. They are attractive because they can be integrated easily into conventional Monte Carlo rendering frameworks, they are applicable to most rendering effects, and their computational overhead is modest.
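The sampling side of such pipelines can be equally simple in outline. The sketch below (array names are illustrative, not from any particular renderer) distributes a budget of additional samples across pixels in proportion to the estimated error of each pixel's current mean.

```python
import numpy as np

def allocate_samples(var_img, n_img, budget):
    """Distribute `budget` new samples across pixels in proportion to the
    estimated error of the current per-pixel mean (variance / sample count).

    var_img : per-sample variance at each pixel
    n_img   : number of samples taken at each pixel so far
    budget  : total number of additional samples to distribute
    """
    err = var_img / np.maximum(n_img, 1)          # variance of the per-pixel mean
    total = err.sum()
    weights = err / total if total > 0 else np.full(err.shape, 1.0 / err.size)
    extra = np.floor(weights * budget).astype(int)
    # Hand out any remainder to the pixels with the largest estimated error.
    remainder = budget - extra.sum()
    if remainder > 0:
        worst = np.argsort(err, axis=None)[::-1][:remainder]
        np.add.at(extra.reshape(-1), worst, 1)
    return extra
```

In practice the error estimate would come from the reconstruction step, so sampling and filtering alternate until the sample budget is exhausted.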
Abstract:
The Interstellar Boundary Explorer (IBEX) has been directly observing neutral atoms from the local interstellar medium for the last six years (2009–2014). This paper ties together the 14 studies in this Astrophysical Journal Supplement Series Special Issue, which collectively describe the IBEX interstellar neutral results from this epoch and provide a number of other relevant theoretical and observational results. Interstellar neutrals interact with each other and with the ionized portion of the interstellar population in the "pristine" interstellar medium ahead of the heliosphere. Then, in the heliosphere's close vicinity, the interstellar medium begins to interact with escaping heliospheric neutrals. In this study, we compare the results from two major analysis approaches led by IBEX groups in New Hampshire and Warsaw. We also directly address the question of the distance upstream to the pristine interstellar medium and adjust both sets of results to a common distance of ~1000 AU. The two analysis approaches are quite different, but yield fully consistent measurements of the interstellar He flow properties, further validating our findings. While detailed error bars are given for both approaches, we recommend that for most purposes, the community use "working values" of ~25.4 km s⁻¹, ~75.7° ecliptic inflow longitude, ~−5.1° ecliptic inflow latitude, and ~7500 K temperature at ~1000 AU upstream. Finally, we briefly address future opportunities for even better interstellar neutral observations to be provided by the Interstellar Mapping and Acceleration Probe mission, which was recommended as the next major Heliophysics mission by the NRC's 2013 Decadal Survey.
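For convenience, the quoted "working values" can be turned into a Cartesian flow vector in ecliptic coordinates. The short sketch below assumes the listed angles give the upwind (inflow) direction, so the gas itself moves along the opposite unit vector; sign conventions vary between papers, so treat this as a back-of-the-envelope conversion.

```python
import numpy as np

# "Working values" quoted in the abstract (speed in km/s, angles in degrees).
V_INFLOW = 25.4
LON_DEG, LAT_DEG = 75.7, -5.1

def flow_vector(speed=V_INFLOW, lon_deg=LON_DEG, lat_deg=LAT_DEG):
    """Interstellar He bulk-flow velocity as a Cartesian vector (km/s) in
    ecliptic coordinates, assuming the angles specify the upwind direction."""
    lon, lat = np.radians(lon_deg), np.radians(lat_deg)
    upwind = np.array([np.cos(lat) * np.cos(lon),
                       np.cos(lat) * np.sin(lon),
                       np.sin(lat)])
    return -speed * upwind

print(flow_vector())   # roughly [-6.2, -24.5, 2.3] km/s
```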
Abstract:
In this paper, we simulate numerically the catastrophic disruption of a large asteroid as a result of a collision with a smaller projectile and the subsequent reaccumulation of fragments under their mutual gravitational attraction. We then investigate the original location within the parent body of the small pieces that eventually reaccumulate to form the largest offspring of the disruption, as a function of the internal structure of the parent body. We consider four cases that may represent the internal structure of such a body (whose diameter is fixed at 250 km) at various early stages of Solar System evolution: fully molten, half molten (i.e., a 26 km-deep outer layer of melt containing half of the mass), solid except for a thin molten layer (8 km thick) centered at 10 km depth, and fully solid. The solid material has the properties of basalt. We then focus on the three largest offspring that have enough reaccumulated pieces to consider. Our results indicate that the particles that eventually reaccumulate to form the largest reaccumulated bodies retain a memory of their original locations in the parent body. Most particles in each reaccumulated body are clustered from the same original region, even if their reaccumulation takes place far away. The extent of the original region varies considerably depending on the internal structure of the parent and appears to shrink as the parent body becomes more solid. The fraction of particles coming from a given depth is computed for the four cases, which can place constraints on the internal structure of the parent bodies of some meteorites. As one example, we consider the ureilites, which in some petrogenetic models are inferred to have formed at particular depths within their parent body.
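The depth-of-origin bookkeeping described above reduces to a histogram over each particle's initial radial position. A minimal sketch, assuming hypothetical arrays for the initial radii and for membership in a given reaccumulated body:

```python
import numpy as np

PARENT_RADIUS_KM = 125.0   # parent body diameter fixed at 250 km

def depth_fractions(r_initial_km, member_mask, bin_width_km=10.0):
    """Fraction of a reaccumulated body's particles originating in each
    depth bin of the parent body.

    r_initial_km : initial radial distance of every simulated particle (km)
    member_mask  : boolean array, True where the particle ended up in the
                   reaccumulated body of interest
    """
    depth = PARENT_RADIUS_KM - r_initial_km[member_mask]   # depth below surface
    edges = np.arange(0.0, PARENT_RADIUS_KM + bin_width_km, bin_width_km)
    counts, _ = np.histogram(depth, bins=edges)
    return edges[:-1], counts / max(counts.sum(), 1)
```

Comparing such histograms across the four internal-structure cases is what reveals how the source region of each offspring shrinks as the parent becomes more solid.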