866 results for Generalized Lebesgue Spaces
Abstract:
The problem of discovering frequent poly-regions (i.e. regions of high occurrence of a set of items or patterns of a given alphabet) in a sequence is studied, and three efficient approaches are proposed to solve it. The first one is entropy-based and applies a recursive segmentation technique that produces a set of candidate segments which may potentially lead to a poly-region. The key idea of the second approach is the use of a set of sliding windows over the sequence. Each sliding window covers a sequence segment and keeps a set of statistics that mainly include the number of occurrences of each item or pattern in that segment. Combining these statistics efficiently yields the complete set of poly-regions in the given sequence. The third approach applies a technique based on the majority vote, achieving linear running time with a minimal number of false negatives. After identifying the poly-regions, the sequence is converted to a sequence of labeled intervals (each one corresponding to a poly-region). An efficient algorithm for mining frequent arrangements of intervals is applied to the converted sequence to discover frequently occurring arrangements of poly-regions in different parts of DNA, including coding regions. The proposed algorithms are tested on various DNA sequences, producing biologically meaningful results.
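The sliding-window idea of the second approach can be sketched as follows. This is a minimal illustration, not the paper's algorithm: the fixed window width, the density threshold, and the merging of overlapping dense windows into maximal intervals are all simplifying assumptions made here.

```python
from collections import Counter

def poly_regions(seq, item, win, min_frac):
    """Slide a fixed-width window over `seq` and report maximal
    half-open intervals (start, end) in which `item` occupies at
    least `min_frac` of the positions.  Occurrence counts are
    updated incrementally, so the whole scan is linear time."""
    if len(seq) < win:
        return []
    counts = Counter(seq[:win])
    dense = [counts[item] / win >= min_frac]  # window starting at 0
    for start in range(1, len(seq) - win + 1):
        counts[seq[start - 1]] -= 1          # symbol leaving the window
        counts[seq[start + win - 1]] += 1    # symbol entering the window
        dense.append(counts[item] / win >= min_frac)
    # merge runs of overlapping dense windows into maximal intervals
    regions, i = [], 0
    while i < len(dense):
        if dense[i]:
            j = i
            while j + 1 < len(dense) and dense[j + 1]:
                j += 1
            regions.append((i, j + win))
            i = j + 1
        else:
            i += 1
    return regions
```

For example, `poly_regions("ATATAAAAAAATAT", "A", 5, 0.8)` reports the single A-rich region spanning positions 2 to 13.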
Abstract:
In this paper, we introduce the Generalized Equality Classifier (GEC) for use as an unsupervised clustering algorithm in categorizing analog data. GEC is based on a formal definition of inexact equality originally developed for voting in fault-tolerant software applications. GEC is defined using a metric space framework. The only parameter in GEC is a scalar threshold which defines the approximate equality of two patterns. Here, we compare the characteristics of GEC to the ART2-A algorithm (Carpenter, Grossberg, and Rosen, 1991). In particular, we show that GEC with the Hamming distance performs the same optimization as ART2. Moreover, GEC has lower computational requirements than ART2-A on serial machines.
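The single-threshold idea can be sketched as a one-pass clustering loop. This is an illustrative sketch only, not the paper's GEC definition: taking the first member of a cluster as its prototype, and assigning to the first cluster within the threshold, are assumptions made here for brevity.

```python
def hamming(a, b):
    """Hamming distance between two equal-length patterns."""
    return sum(x != y for x, y in zip(a, b))

def gec_cluster(patterns, threshold, dist=hamming):
    """One-pass threshold clustering in the spirit of GEC: a
    pattern joins the first existing cluster whose prototype lies
    within `threshold` of it (approximate equality); otherwise it
    seeds a new cluster.  Prototypes are the first member seen."""
    prototypes, labels = [], []
    for p in patterns:
        for k, proto in enumerate(prototypes):
            if dist(p, proto) <= threshold:
                labels.append(k)
                break
        else:
            prototypes.append(p)
            labels.append(len(prototypes) - 1)
    return labels, prototypes
```

With threshold 1, the patterns `[0,0,0,0]`, `[0,0,0,1]`, `[1,1,1,1]`, `[1,1,1,0]` fall into two clusters, since each pair differs from its prototype in at most one coordinate.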
Abstract:
A dynamic distributed model is presented that reproduces the dynamics of a wide range of battle scenarios with a general and abstract representation. The model illustrates the rich dynamic behavior that can be achieved from a simple generic model.
Abstract:
Assuming that daily spot exchange rates follow a martingale process, we derive the implied time-series process for the vector of 30-day forward rate forecast errors using weekly data. The conditional second moment matrix of this vector is modelled as a multivariate generalized ARCH process. The estimated model is used to test the hypothesis that the risk premium is a linear function of the conditional variances and covariances, as suggested by the standard asset pricing theory literature. Little support is found for this theory; instead, lagged changes in the forward rate appear to be correlated with the 'risk premium'. © 1990.
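The linear-in-moments hypothesis can be written compactly; the notation below (forecast error, conditional covariance matrix $H_t$, coefficient matrix $B$) is an assumed shorthand, not the paper's own:

```latex
% Forecast-error vector and its conditional second moments
s_{t+1} - f_t = \mu_t + \varepsilon_{t+1},
\qquad \varepsilon_{t+1} \mid \mathcal{I}_t \sim (0,\, H_t),
% Hypothesis under test: the risk premium is linear in the
% distinct elements of the conditional covariance matrix H_t
\mu_t = \alpha + B \,\operatorname{vech}(H_t).
```

Here $\operatorname{vech}(H_t)$ stacks the distinct variances and covariances, so rejecting $B \neq 0$ in favour of lagged forward-rate changes is what the abstract's negative finding amounts to.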
Abstract:
In this paper, we propose generalized sampling approaches for measuring a multi-dimensional object using a compact compound-eye imaging system called thin observation module by bound optics (TOMBO). This paper presents the proposed system model, physical examples, and simulations that verify TOMBO imaging using generalized sampling. In the system, an object is modulated and multiplied by a weight distribution with physical coding, and the coded optical signal is integrated onto a detector array. A numerical estimation algorithm employing a sparsity constraint is used for object reconstruction.
Abstract:
We consider the problem of variable selection in regression modeling in high-dimensional spaces where there is known structure among the covariates. This is an unconventional variable selection problem for two reasons: (1) The dimension of the covariate space is comparable to, and often much larger than, the number of subjects in the study, and (2) the covariate space is highly structured, and in some cases it is desirable to incorporate this structural information into the model building process. We approach this problem through the Bayesian variable selection framework, where we assume that the covariates lie on an undirected graph and formulate an Ising prior on the model space for incorporating structural information. Certain computational and statistical problems arise that are unique to such high-dimensional, structured settings, the most interesting being the phenomenon of phase transitions. We propose theoretical and computational schemes to mitigate these problems. We illustrate our methods on two different graph structures: the linear chain and the regular graph of degree k. Finally, we use our methods to study a specific application in genomics: the modeling of transcription factor binding sites in DNA sequences. © 2010 American Statistical Association.
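An Ising prior over inclusion indicators can be sketched as below. This is a generic illustration, not the paper's parameterization: the exact form of the interaction term varies across the literature, and the agreement form and parameter names `a`, `b` used here are assumptions.

```python
def ising_log_prior(gamma, edges, a, b):
    """Unnormalized log Ising prior over inclusion indicators
    gamma_i in {0, 1}: a * (number of included covariates)
    controls model size, while b * (number of edges whose two
    endpoints agree) rewards neighbouring covariates on the
    graph entering or leaving the model together."""
    size_term = a * sum(gamma)
    agree_term = b * sum(gamma[i] == gamma[j] for i, j in edges)
    return size_term + agree_term

def chain_edges(p):
    """Edge set of the linear-chain graph on p covariates."""
    return [(i, i + 1) for i in range(p - 1)]
```

On the linear chain of four covariates with `a = -1.0` (penalizing size) and `b = 0.5`, the configuration `[1, 1, 0, 0]` scores higher than the equally sized but scattered `[1, 0, 1, 0]`, which is the smoothing effect the prior is meant to deliver.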
Abstract:
It was shown in previous papers that the resolution of a confocal scanning microscope can be significantly improved by measuring, for each scanning position, the full diffraction image and by inverting these data to recover the value of the object at the confocal point. In the present work, the authors generalize the data inversion procedure by allowing the data samples recorded at other scanning positions to be used in reconstructing the object at a given point. This leads them to a family of generalized inversion formulae, either exact or approximate. Some previously known formulae are re-derived here as special cases in a particularly simple way.
Abstract:
We present iterative algorithms for solving linear inverse problems with discrete data and compare their performances with the method of singular function expansion, in view of applications in optical imaging and particle sizing.
Abstract:
The powerful general Pacala-Hassell host-parasitoid model for a patchy environment, which allows host density-dependent heterogeneity (HDD) to be distinguished from between-patch, host density-independent heterogeneity (HDI), is reformulated within the generalized linear model (GLM) family. This improves accessibility through the provision of general software within well-known statistical systems, and allows a rich variety of models to be formulated. Covariates such as age class, host density and abiotic factors may be included easily. For the case where there is no HDI, the formulation is a simple GLM. When there is HDI in addition to HDD, the formulation is a hierarchical generalized linear model. Two forms of HDI model are considered, both with between-patch variability: one has binomial variation within patches and one has extra-binomial, overdispersed variation within patches. Examples are given demonstrating parameter estimation with standard errors, and hypothesis testing. For one example given, the extra-binomial component of the HDI heterogeneity in parasitism is itself shown to be strongly density dependent.
Abstract:
A class of generalized Lévy Laplacians, which contains the ordinary Lévy Laplacian as a special case, is considered. Topics such as the limit average of the second-order functional derivative with respect to a certain equally dense (uniformly bounded) orthonormal basis, and the relations with Kuo's Fourier transform and other infinite-dimensional Laplacians, are studied.
Abstract:
A generalized Markov branching process (GMBP) is a Markov branching model in which the infinitesimal branching rates are modified with an interaction index. It is proved that there always exists exactly one GMBP. An associated differential-integral equation is derived. The extinction probability and the mean and conditional mean extinction times are obtained. Ergodicity and stability of GMBPs with resurrection are also considered. Easy checking criteria are established for ordinary and strong ergodicity. The equilibrium distribution is given in an elegant closed form. The probabilistic meaning of our results is clear and is explained.
Abstract:
In The Eye of Power, Foucault delineated the key concerns surrounding hospital architecture in the latter half of the eighteenth century as being the ‘visibility of bodies, individuals and things'. As such, the ‘new form of hospital' that came to be developed ‘was at once the effect and support of a new type of gaze'. This was a gaze that was not simply concerned with ways of minimising overcrowding or cross-contamination. Rather, this was a surveillance intended to produce knowledge about the pathological bodies contained within the hospital walls. This would then allow for their appropriate classification. Foucault went on to describe how these principles came to be applied to the architecture of prisons. This was exemplified for him in the distinct shape of Bentham's panopticon. This circular design, which has subsequently become an often misused synonym for a contemporary culture of surveillance, was premised on a binary of the seen and the not-seen. An individual observer could stand at the central point of the circle and observe the cells (and their occupants) on the perimeter whilst themselves remaining unseen. The panopticon in its purest form was never constructed, yet it conveys the significance of the production of knowledge through observation that became central to institutional design at this time and to modern thought more broadly. What is curious, though, is that whilst the aim of those late eighteenth-century buildings was to produce well-ventilated spaces suffused with light, this provoked an interest in its opposite. The gothic movement in literature that was developing in parallel conversely took a ‘fantasy world of stone walls, darkness, hideouts and dungeons…' as its landscape (Vidler, 1992: 162). Curiously, despite these modern developments in prison design, the façade took on these characteristics. The gothic imagination came to describe that unseen world that lay behind the outer wall. This is what Evans refers to as an architectural ‘hoax'.
The façade was taken to represent the world within the prison walls, and it was the façade that came to inform the popular imagination about what occurred behind it. The rational, modern principles ordering the prison became conflated with the meanings projected by and onto the façade. This confusion of meanings has then been repeated and reinforced in the subsequent representations of the prison. This is of paramount importance since it is the cinematic and televisual representation of the prison, as I argue here and elsewhere, that maintains this erroneous set of meanings, this ‘hoax'.