938 results for Sampling Theorem
Abstract:
It is well known that quantum correlations for bipartite dichotomic measurements are those of the form γ = (⟨u_i, v_j⟩)_{i,j=1}^n, where the vectors u_i and v_j are in the unit ball of a real Hilbert space. In this work we study the probability of the nonlocal nature of these correlations as a function of α = m/n, where the previous vectors are sampled according to the Haar measure in the unit sphere of ℝ^m. In particular, we prove the existence of an α_0 > 0 such that if α ≤ α_0, γ is nonlocal with probability tending to 1 as n → ∞, while for α > 2, γ is local with probability tending to 1 as n → ∞.
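Below is a minimal Python sketch of the sampling model described in this abstract: Haar-uniform unit vectors on the sphere of ℝ^m are drawn by normalizing independent Gaussian vectors (a standard construction), and the correlation matrix γ_ij = ⟨u_i, v_j⟩ is assembled for a chosen aspect ratio α = m/n. It only illustrates the sampling setup; it does not test locality, and the names and parameter values are illustrative.

import numpy as np

def haar_unit_vectors(n, m, rng):
    # Draw n vectors uniformly (Haar measure) on the unit sphere of R^m
    # by normalizing independent standard Gaussian vectors.
    g = rng.standard_normal((n, m))
    return g / np.linalg.norm(g, axis=1, keepdims=True)

rng = np.random.default_rng(0)
n, alpha = 200, 0.5                      # illustrative values; alpha = m/n as in the abstract
m = max(1, int(alpha * n))
U = haar_unit_vectors(n, m, rng)         # vectors u_i
V = haar_unit_vectors(n, m, rng)         # vectors v_j
gamma = U @ V.T                          # correlation matrix gamma_ij = <u_i, v_j>, entries in [-1, 1]
print(gamma.shape, float(np.abs(gamma).max()))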
Abstract:
We present an algorithm to process images of reflected Placido rings captured by a commercial videokeratoscope. Raw data are obtained with no Cartesian-to-polar-coordinate conversion, thus avoiding interpolation and the associated numerical artifacts. The method provides a characteristic equation for the device and is able to process around 6 times more corneal data than the commercial software. Our proposal allows complete control over the whole process, from the capture of corneal images to the computation of curvature radii.
Abstract:
This correspondence presents an efficient method for reconstructing a band-limited signal in the discrete domain from its crossings with a sine wave. The method makes it possible to design A/D converters that deliver only the crossing timings, which are then used to interpolate the input signal at arbitrary instants. Potentially, it allows for reductions in power consumption and complexity in these converters. The reconstruction in the discrete domain is based on a recently proposed modification of the Lagrange interpolator, which is readily and efficiently implementable with linear complexity, given that it re-uses known schemes for variable fractional-delay (VFD) filters. As a spin-off, the method allows one to perform spectral analysis from sine wave crossings with the complexity of the FFT. Finally, the results in the correspondence are validated in several numerical examples.
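As a toy illustration of the acquisition step only (not of the paper's reconstruction method), the sketch below locates the instants where a band-limited test signal crosses a reference sine wave by detecting sign changes on a dense grid and refining each crossing with a root-finder; the signal, frequencies and grid are assumptions for the example.

import numpy as np
from scipy.optimize import brentq

def sine_crossings(x, f_ref, t_grid):
    # Locate instants where x(t) crosses the reference sine sin(2*pi*f_ref*t).
    # t_grid is a dense grid used only to bracket the roots of x(t) - sin(...).
    d = lambda t: x(t) - np.sin(2 * np.pi * f_ref * t)
    vals = d(t_grid)
    crossings = []
    for a, b, va, vb in zip(t_grid[:-1], t_grid[1:], vals[:-1], vals[1:]):
        if va * vb < 0:                          # sign change brackets a crossing
            crossings.append(brentq(d, a, b))    # refine the crossing instant
    return np.array(crossings)

# Band-limited test signal (two tones) against a 1 kHz reference sine.
x = lambda t: 0.6 * np.sin(2 * np.pi * 150 * t) + 0.3 * np.cos(2 * np.pi * 420 * t)
t_grid = np.linspace(0.0, 0.01, 2001)
print(len(sine_crossings(x, 1000.0, t_grid)), "crossings found in 10 ms")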
Abstract:
Light traps have been used widely to sample insect abundance and diversity, but their performance for sampling scarab beetles in tropical forests as a function of light source type and sampling hours throughout the night has not been evaluated. The efficiency of mercury-vapour lamps, cool white light and ultraviolet light sources in attracting Dynastinae, Melolonthinae and Rutelinae scarab beetles, as well as the most suitable period of the night to carry out the sampling, was tested in different forest areas of Costa Rica. Our results showed that light source wavelengths and hours of sampling influenced scarab beetle catches. No significant differences were observed in trap performance between the ultraviolet light and mercury-vapour traps, whereas both of these methods yielded significantly higher species richness and abundance than cool white light traps. Species composition also varied between methods. Large differences appeared between catches across the sampling period, with the first five hours of the night being more effective than the last five hours. Because of their high efficiency and logistic advantages, we recommend ultraviolet light traps deployed during the first hours of the night as the best sampling method for biodiversity studies of these scarab beetles in tropical forests.
Abstract:
This paper provides new versions of the Farkas lemma characterizing those inequalities of the form f(x) ≥ 0 which are consequences of a composite convex inequality (S ◦ g)(x) ≤ 0 on a closed convex subset of a given locally convex topological vector space X, where f is a proper lower semicontinuous convex function defined on X, S is an extended sublinear function, and g is a vector-valued S-convex function. In parallel, associated versions of a stable Farkas lemma, considering arbitrary linear perturbations of f, are also given. These new versions of the Farkas lemma, and their corresponding stable forms, are established under the weakest constraint qualification conditions (the so-called closedness conditions), and they are actually equivalent to each other, as well as equivalent to an extended version of the so-called Hahn–Banach–Lagrange theorem and its stable version, respectively. It is shown that any of them implies analytic and algebraic versions of the Hahn–Banach theorem and the Mazur–Orlicz theorem for extended sublinear functions.
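For orientation, the consequence relation that these Farkas-type results characterize can be written schematically (a restatement of the abstract in symbols, with C denoting the closed convex subset of X):

% f proper lsc convex on X, S extended sublinear, g vector-valued S-convex, C closed convex in X
\forall x \in C:\quad (S \circ g)(x) \le 0 \;\Longrightarrow\; f(x) \ge 0 .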
Abstract:
The choice of sampling methods to survey saproxylic beetles is a key aspect of assessing conservation strategies for one of the most endangered assemblages in Europe. We evaluated the efficiency of three sampling methods: baited tube traps (TT), window traps in front of a hollow opening (WT), and emergence traps covering tree hollows (ET) to study richness and diversity of saproxylic beetle assemblages at species and family levels in Mediterranean woodlands. We also examined trap efficiency to report ecological diversity, and changes in the relative richness and abundance of species forming trophic guilds: xylophagous, saprophagous/saproxylophagous, xylomycetophagous, predators and commensals. WT and ET were similarly effective in reporting species richness and diversity at species and family levels, and provided an accurate profile of both the flying active and hollow-linked saproxylic beetle assemblages. WT and ET were the most complementary methods, together reporting more than 90 % of richness and diversity at both species and family levels. Diversity, richness and abundance of guilds were better characterized by ET, which indicates higher efficiency in outlining the ecological community of saproxylics that inhabit tree hollows. TT were the least effective method at both taxonomic levels, sampling a biased portion of the beetle assemblage attracted by the trapping principle; however, they could be used as a specific method for families such as Bostrichiidae, Biphyllidae, Melyridae, Mycetophagidae or Curculionidae Scolytinae species. Finally, the combination of ET and WT allows a better characterization of saproxylic assemblages in Mediterranean woodlands, by recording species with different biology and linked to different microhabitat types.
Abstract:
This note provides an approximate version of the Hahn–Banach theorem for not-necessarily-convex extended-real-valued positively homogeneous functions of degree one. Given such a function p : X → ℝ ∪ {+∞} defined on the real vector space X, and a linear function ℓ defined on a subspace V of X and dominated by p (i.e. ℓ(x) ≤ p(x) for all x ∈ V), we say that ℓ can approximately be p-extended to X if ℓ is the pointwise limit of a net of linear functions on V, each of which can be extended to a linear function defined on X and dominated by p. The main result of this note proves that ℓ can approximately be p-extended to X if and only if ℓ is dominated by p∗∗, the pointwise supremum of the family of all the linear functions on X which are dominated by p.
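In symbols, the main equivalence reads as follows (a restatement of the statement above, with p∗∗ defined exactly as in the note):

% l : V -> R linear and dominated by p on V; p** is the upper envelope of the linear minorants of p on X
p^{**}(x) \;=\; \sup\{\varphi(x) : \varphi \ \text{linear on}\ X,\ \varphi \le p\},
\qquad
\ell \ \text{approximately $p$-extendable} \iff \ell(x) \le p^{**}(x)\ \ \text{for all}\ x \in V .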
Abstract:
For non-negative random variables with finite means we introduce an analogue of the equilibrium residual-lifetime distribution based on the quantile function. This allows us to construct new distributions with support (0, 1), and to obtain a new quantile-based version of the probabilistic generalization of Taylor's theorem. Similarly, for pairs of stochastically ordered random variables we arrive at a new quantile-based form of the probabilistic mean value theorem. The latter involves a distribution that generalizes the Lorenz curve. We investigate the special case of proportional quantile functions and apply the given results to various models based on classes of distributions and measures of risk theory. Motivated by some stochastic comparisons, we also introduce the “expected reversed proportional shortfall order” and provide a new characterization of random lifetimes involving the reversed hazard rate function.
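For context only, the classical (distribution-function based) equilibrium residual-lifetime distribution that the quantile-based construction parallels is the standard one below; the paper's quantile-based analogue is not reproduced here. For a non-negative random variable X with distribution function F and finite mean μ:

% classical equilibrium (residual-lifetime) distribution; background fact, not the paper's new definition
F_e(x) \;=\; \frac{1}{\mu}\int_0^{x} \bigl(1 - F(t)\bigr)\,dt , \qquad x \ge 0 .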
Abstract:
Since the early days of 3D computer vision, techniques that reduce the data to a tractable size while preserving the important aspects of the scene have been necessary. Currently, with the new low-cost RGB-D sensors, which provide a stream of color and 3D data at approximately 30 frames per second, this need is becoming even more relevant. Many applications make use of these sensors and need a preprocessing step to downsample the data, in order to either reduce the processing time or improve the data (e.g., reducing noise or enhancing the important features). In this paper, we present a comparison of downsampling techniques that are based on different principles. Concretely, five different downsampling methods are included: a bilinear-based method, a normal-based method, a color-based method, a combination of the normal- and color-based samplings, and a growing neural gas (GNG)-based approach. For the comparison, two different models acquired with the Blensor software have been used. Moreover, to evaluate the effect of the downsampling in a real application, a 3D non-rigid registration is performed with the sampled data. From the experimentation we can conclude that, depending on the purpose of the application, some kernels of the sampling methods can drastically improve the results. Bilinear- and GNG-based methods provide homogeneous point clouds, but the color-based and normal-based methods provide datasets with a higher density of points in areas with specific features. In the non-rigid application, if a color-based sampled point cloud is used, it is possible to properly register two datasets in cases where intensity data are relevant in the model, and to outperform the results obtained when only a homogeneous sampling is used.
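The sketch below is a hypothetical illustration of the idea behind feature-driven (here, color-based) downsampling, namely keeping a denser set of points where the color differs most from the local neighbourhood; it is not one of the kernels evaluated in the paper, and the function name, parameters and synthetic data are assumptions.

import numpy as np
from scipy.spatial import cKDTree

def color_based_downsample(points, colors, keep_ratio=0.2, k=16, seed=0):
    # Illustrative color-based downsampling: retain the points whose color differs
    # most from the mean color of their k nearest neighbours, plus a small uniform
    # sample of the remaining points so that flat regions are not emptied entirely.
    rng = np.random.default_rng(seed)
    _, idx = cKDTree(points).query(points, k=k)       # k nearest neighbours of each point
    saliency = np.linalg.norm(colors - colors[idx].mean(axis=1), axis=1)
    n_keep = int(keep_ratio * len(points))
    salient = np.argsort(saliency)[-n_keep:]          # densify color-feature regions
    uniform = rng.choice(len(points), size=max(1, n_keep // 4), replace=False)
    keep = np.unique(np.concatenate([salient, uniform]))
    return points[keep], colors[keep]

# Example with a synthetic colored cloud standing in for RGB-D data.
rng = np.random.default_rng(1)
pts, cols = rng.random((5000, 3)), rng.random((5000, 3))
sub_pts, sub_cols = color_based_downsample(pts, cols)
print(sub_pts.shape)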
Abstract:
Sampling may promote prolonged engagement in sport by limiting physical injuries (Fraser-Thomas et al., 2005). Overtraining injuries are a concern for young athletes who specialize in one sport and engage in high volumes of deliberate practice (Hollander, Meyers, & Leunes, 1995; Law, Côté, & Ericsson, 2007). For instance, young gymnasts who practice for over 16 hours a week have been shown to have higher incidences of back injuries (Goldstein, Berger, Windler, & Jackson, 1991). A sampling approach based on child-controlled play (e.g. deliberate play) rather than highly adult-controlled practice (e.g. deliberate practice) has been proposed as a strategy to limit overuse and other sport-related injuries (Micheli, Glassman, & Klein, 2000). In summary, sampling may protect against sport attrition by limiting sport-related injuries and allowing children to have early experiences in sport that are enjoyable.

Psychosocial Benefits of Sampling

Only a small percentage of children who participate in school sports ever become elite athletes. Therefore, the psychosocial outcomes of sport participation are particularly important to consider. Recent studies with youth between the ages of 11 and 17 have found that those who are involved in a variety of extracurricular activities (e.g. sports, volunteering, arts) score more favourably on outcome measures such as Grade Point Average (GPA; Fredricks & Eccles, 2006a) and positive peer relationships (Fredricks & Eccles, 2006b) than youth who participate in fewer activities. These patterns are thought to exist because each extracurricular activity brings its own distinct pattern of socialization experiences that reinforce certain behaviours and/or teach various skills (Fredricks & Eccles, 2006b; Rose-Krasnor, Bussen, Willoughby, & Chambers, 2006). This contention is corroborated by studies of children's and youths' experiences in extracurricular activities indicating that youth have unique experiences in each activity that contribute to their development (Hansen, Larson, & Dworkin, 2003; Larson, Hansen, & Moneta, 2006). This has led Wilkes and Côté (2007) to propose that children who sample different activities (through their own choice or by virtue of parental direction) have a greater chance of developing the following five developmental outcomes compared to children who specialize in one activity: 1) life skills, 2) prosocial behaviour, 3) healthy identity, 4) diverse peer groups and 5) social capital.
Abstract:
wgttest performs a test proposed by DuMouchel and Duncan (1983) to evaluate whether the weighted and unweighted estimates of a regression model are significantly different.
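wgttest itself is a Stata command; as a rough companion, the following Python sketch implements one common formulation of the DuMouchel–Duncan (1983) idea, assuming statsmodels is available: augment the unweighted regression with weight-by-regressor interaction terms and jointly F-test those terms. Variable names and the synthetic data are illustrative, and this is not the wgttest implementation.

import numpy as np
import statsmodels.api as sm

def dumouchel_duncan_test(y, X, w):
    # Augment the unweighted OLS model with weight-by-regressor interactions
    # (including the weight itself via the intercept column) and F-test whether
    # the interaction coefficients are jointly zero; a significant F suggests
    # that weighted and unweighted estimates differ.
    Xc = sm.add_constant(np.asarray(X, dtype=float))
    WX = Xc * np.asarray(w, dtype=float)[:, None]
    fit = sm.OLS(np.asarray(y, dtype=float), np.column_stack([Xc, WX])).fit()
    k = Xc.shape[1]
    R = np.hstack([np.zeros((k, k)), np.eye(k)])     # H0: interaction block = 0
    return fit.f_test(R)

# Illustrative use with synthetic data and hypothetical survey weights.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2))
w = rng.uniform(0.5, 2.0, size=500)
y = 1.0 + X @ np.array([0.8, -0.3]) + rng.normal(size=500)
print(dumouchel_duncan_test(y, X, w))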