884 results for Dunkl Kernel


Relevance:

10.00%

Abstract:

For two multinormal populations with equal covariance matrices, the likelihood ratio discriminant function, an alternative allocation rule to the sample linear discriminant function when n1 ≠ n2, is studied analytically. Under the assumption of a known covariance matrix, its distribution is derived and the expectations of its actual and apparent error rates are evaluated and compared with those of the sample linear discriminant function. This comparison indicates that the likelihood ratio allocation rule is robust to unequal sample sizes. The quadratic discriminant function is studied, its distribution reviewed and the evaluation of its probabilities of misclassification discussed. For known covariance matrices, the distribution of the sample quadratic discriminant function is derived. When the known covariance matrices are proportional, exact expressions for the expectations of its actual and apparent error rates are obtained and evaluated. The effectiveness of the sample linear discriminant function for this case is also considered. Estimation of the true log-odds for two multinormal populations with equal or unequal covariance matrices is studied. The estimative, Bayesian predictive and kernel methods are compared by evaluating their biases and mean square errors, and some algebraic expressions for these quantities are derived. With equal covariance matrices the predictive method is preferable; the source of this superiority is investigated by considering its performance at various levels of fixed true log-odds. It is also shown that the predictive method is sensitive to n1 ≠ n2. For unequal but proportional covariance matrices, the unbiased estimative method is preferred. Product Normal kernel density estimates are used to give a kernel estimator of the true log-odds, and the effect of correlation between the variables with product kernels is considered. With equal covariance matrices, the kernel and parametric estimators are compared by simulation. For moderately correlated variables and large dimensions, the product kernel method is a good estimator of the true log-odds.
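To make the last construction concrete: a product Normal kernel density estimate places an independent univariate Gaussian kernel on each coordinate, and the kernel log-odds estimator is the log-ratio of the two class density estimates. Below is a minimal Python sketch of this idea; the function names, the rule-of-thumb bandwidths and the toy data are illustrative assumptions, not taken from the thesis.

    import numpy as np

    def product_normal_kde(x, sample, bandwidths):
        # Product Normal KDE at point x: each coordinate gets its own
        # univariate Gaussian kernel; the multivariate kernel is their product.
        z = (x - sample) / bandwidths                         # (n, d) standardized residuals
        kernels = np.exp(-0.5 * z ** 2) / np.sqrt(2 * np.pi)  # univariate normal kernels
        return kernels.prod(axis=1).mean() / bandwidths.prod()

    def kde_log_odds(x, sample1, sample2, h1, h2, prior1=0.5):
        # Kernel estimator of the true log-odds log{p1 f1(x) / (p2 f2(x))}.
        f1 = product_normal_kde(x, sample1, h1)
        f2 = product_normal_kde(x, sample2, h2)
        return np.log(prior1 * f1) - np.log((1 - prior1) * f2)

    # Toy bivariate example with unequal sample sizes (n1 != n2).
    rng = np.random.default_rng(0)
    s1 = rng.normal(loc=0.0, size=(60, 2))
    s2 = rng.normal(loc=1.0, size=(30, 2))
    bw = lambda s: s.std(axis=0) * len(s) ** (-1 / (s.shape[1] + 4))  # Scott-type rule of thumb
    print(kde_log_odds(np.zeros(2), s1, s2, bw(s1), bw(s2)))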

Relevance:

10.00%

Abstract:

PURPOSE: X-ray computed tomography (CT) is widely used, both clinically and preclinically, for fast, high-resolution anatomic imaging; however, compelling opportunities exist to expand its use in functional imaging applications. For instance, spectral information combined with nanoparticle contrast agents enables quantification of tissue perfusion levels, while temporal information details cardiac and respiratory dynamics. The authors propose and demonstrate a projection acquisition and reconstruction strategy for 5D CT (3D + dual energy + time) which recovers spectral and temporal information without substantially increasing radiation dose or sampling time relative to anatomic imaging protocols. METHODS: The authors approach the 5D reconstruction problem within the framework of low-rank and sparse matrix decomposition. Unlike previous work on rank-sparsity constrained CT reconstruction, the authors establish an explicit rank-sparse signal model to describe the spectral and temporal dimensions. The spectral dimension is represented as a well-sampled time- and energy-averaged image plus regularly undersampled principal components describing the spectral contrast. The temporal dimension is represented as the same time- and energy-averaged reconstruction plus contiguous, spatially sparse, and irregularly sampled temporal contrast images. Using a nonlinear, image-domain filtration approach that the authors refer to as rank-sparse kernel regression, they transfer image structure from the well-sampled time- and energy-averaged reconstruction to the spectral and temporal contrast images. This regularization strategy strictly constrains the reconstruction problem while approximately separating the temporal and spectral dimensions. Separability results in a highly compressed representation of the 5D data in which projections are shared between the temporal and spectral reconstruction subproblems, enabling substantial undersampling. The authors solve the 5D reconstruction problem using the split Bregman method and GPU-based implementations of backprojection, reprojection, and kernel regression. Using a preclinical mouse model, they apply the proposed algorithm to study myocardial injury following radiation treatment of breast cancer. RESULTS: Quantitative 5D simulations are performed using the MOBY mouse phantom. Twenty data sets (ten cardiac phases, two energies) are reconstructed with 88 μm isotropic voxels from 450 total projections acquired over a single 360° rotation. In vivo 5D myocardial injury data sets acquired in two mice injected with gold and iodine nanoparticles are also reconstructed with 20 data sets per mouse using the same acquisition parameters (dose: ∼60 mGy). For both the simulations and the in vivo data, the reconstruction quality is sufficient to perform material decomposition into gold and iodine maps, to localize the extent of myocardial injury (gold accumulation), and to measure cardiac functional metrics (vascular iodine). The 5D CT imaging protocol represents a 95% reduction in radiation dose per cardiac phase and energy and a 40-fold decrease in projection sampling time relative to the authors' standard imaging protocol. CONCLUSIONS: The proposed 5D CT data acquisition and reconstruction protocol efficiently exploits the rank-sparse nature of spectral and temporal CT data to provide high-fidelity reconstruction results without increased radiation dose or sampling time.
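The low-rank and sparse matrix decomposition framework named above can be illustrated, in a generic form far simpler than the authors' 5D algorithm, by alternating two proximal steps: singular-value thresholding for the low-rank part and elementwise soft thresholding for the sparse part. The following NumPy sketch is a hypothetical illustration of that generic decomposition, not the published reconstruction code.

    import numpy as np

    def svt(M, tau):
        # Singular value thresholding: proximal operator of the nuclear norm.
        U, s, Vt = np.linalg.svd(M, full_matrices=False)
        return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

    def soft(M, tau):
        # Elementwise soft thresholding: proximal operator of the l1 norm.
        return np.sign(M) * np.maximum(np.abs(M) - tau, 0.0)

    def low_rank_plus_sparse(X, lam=0.1, tau=1.0, n_iter=100):
        # Alternate the two shrinkage steps so that X splits into a low-rank
        # component L (shared structure) plus a sparse component S (contrast).
        L = np.zeros_like(X)
        S = np.zeros_like(X)
        for _ in range(n_iter):
            L = svt(X - S, tau)
            S = soft(X - L, lam)
        return L, S

    # Columns of X might be vectorized time/energy frames: shared anatomy
    # collects in L, while localized contrast changes collect in S.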

Relevance:

10.00%

Abstract:

This paper presents a proactive approach to load sharing and describes the architecture of a scheme, Concert, based on this approach. A proactive approach is characterized by a shift of emphasis from reacting to load imbalance to avoiding its occurrence. In contrast, in a reactive load sharing scheme, activity is triggered when a processing node is either overloaded or underloaded; the main drawback of this approach is that a load imbalance is allowed to develop before costly corrective action is taken. Concert is a load sharing scheme for loosely coupled distributed systems. Under this scheme, load and task behaviour information is collected and cached in advance of when it is needed. Concert uses Linux as a platform for development. Implemented partially in kernel space and partially in user space, it achieves transparency to users and applications whilst keeping the extent of kernel modifications to a minimum. Non-preemptive task transfers are used exclusively, motivated by lower complexity, lower overheads and faster transfers. The goal is to minimize the average response time of tasks. Concert is compared with other schemes by considering the level of transparency it provides with respect to users, tasks and the underlying operating system.
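As a rough illustration of the proactive idea (collect and cache load information ahead of need, then place tasks non-preemptively at start time), consider the toy Python sketch below. All names are hypothetical; Concert itself is split between Linux kernel space and user space and is not reproduced here.

    import time

    class LoadCache:
        # Proactively collected, cached load figures for the nodes in the system.
        def __init__(self, nodes, period=1.0):
            self.loads = {n: 0.0 for n in nodes}
            self.period = period

        def poll_forever(self, read_load):
            # Gather load information *before* it is needed, instead of reacting
            # only once a node is already overloaded or underloaded.
            while True:
                for n in self.loads:
                    self.loads[n] = read_load(n)
                time.sleep(self.period)

        def place(self, local_node):
            # Non-preemptive placement: pick a target once, when the task starts.
            target = min(self.loads, key=self.loads.get)
            return target if self.loads[target] < self.loads[local_node] else local_node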

Relevance:

10.00%

Abstract:

The effect of different factors (spawning biomass, environmental conditions) on recruitment is a subject of great importance in the management of fisheries, recovery plans and scenario exploration. In this study, recently proposed supervised classification techniques, tested by the machine-learning community, are applied to forecast the recruitment of seven fish species of the North East Atlantic (anchovy, sardine, mackerel, horse mackerel, hake, blue whiting and albacore), using spawning, environmental and climatic data. In addition, the probabilistic flexible naive Bayes classifier (FNBC) is proposed as a modelling approach to reduce uncertainty for fisheries management purposes. These improvements aim to sharpen the probability estimates of each possible outcome (low, medium and high recruitment) based on kernel density estimation, which is crucial for informed management decision making under high uncertainty. Finally, a comparison between goodness-of-fit and generalization power is provided in order to assess the reliability of the final forecasting models. In most cases the proposed methodology provides useful information for management, whereas the case of horse mackerel illustrates the limitations of the approach. The proposed improvements allow a better probabilistic estimation of the different scenarios, i.e. they reduce the uncertainty in the provided forecasts.
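A flexible naive Bayes classifier of the kind described replaces the usual parametric class-conditional densities with kernel density estimates. The sketch below is a minimal Python stand-in using scipy's gaussian_kde; the class name and interface are assumptions for illustration, and the paper's FNBC has additional structure.

    import numpy as np
    from scipy.stats import gaussian_kde

    class KDENaiveBayes:
        # Naive Bayes with one univariate Gaussian KDE per class and feature:
        # features are treated as independent given the class (the naive step),
        # but each class-conditional density is estimated by kernels.
        def fit(self, X, y):
            self.classes_ = np.unique(y)
            self.priors_ = {c: np.mean(y == c) for c in self.classes_}
            self.kdes_ = {c: [gaussian_kde(X[y == c, j]) for j in range(X.shape[1])]
                          for c in self.classes_}
            return self

        def predict_proba(self, X):
            # Outcome classes could be 'low', 'medium', 'high' recruitment levels.
            logp = np.zeros((len(X), len(self.classes_)))
            for k, c in enumerate(self.classes_):
                logp[:, k] = np.log(self.priors_[c])
                for j, kde in enumerate(self.kdes_[c]):
                    logp[:, k] += np.log(kde(X[:, j]) + 1e-300)
            p = np.exp(logp - logp.max(axis=1, keepdims=True))
            return p / p.sum(axis=1, keepdims=True)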

Relevance:

10.00%

Abstract:

This study aims to offer an overview of Territorial Development and Organization Models from the perspective of the relationships between the urban, "rururban" and rural environments in Spain. To this end, after reviewing and assessing the conceptual and thematic approaches, urban growth in Spain over recent decades is examined, analysing in detail the importance that the legislative application of laws, plans and regulations has had, and continues to have, both for urban growth itself and for housing demand in Spanish cities, as well as the link between both and housing prices, relating this to the problem of "rururbanization" and of the rural environment.

Relevance:

10.00%

Abstract:

Components of partial disease resistance (PDR) to fusarium head blight (FHB), detected in a seed-germination assay, were compared with whole-plant FHB resistance of 30 USA soft red winter wheat entries in the 2002 Uniform Southern FHB Nursery. Highly significant (P <0·001) differences between cultivars in the in vitro seed-germination assay inoculated with Microdochium majus were correlated to FHB disease incidence (r = -0·41; P <0·05), severity (r = -0·47; P <0·01), FHB index (r = -0·46; P <0·01), damaged kernels (r = -0·52; P <0·01), grain deoxynivalenol (DON) concentration (r = -0·40; P <0·05) and incidence/severity/kernel-damage index (ISK) (r = -0·45; P <0·01) caused by Fusarium graminearum. Multiple linear regression analysis explained a greater percentage of variation in FHB resistance when using both the seed-germination assay and the previously reported detached-leaf assay PDR components as explanatory factors. Shorter incubation periods, longer latent periods and shorter lesion lengths in the detached-leaf assay, and higher germination rates in the seed-germination assay, were related to greater FHB resistance across all disease variables, collectively explaining 62% of variation for incidence, 49% for severity, 56% for F. graminearum-damaged kernels (FDK), 39% for DON and 59% for ISK index. Incubation period was most strongly related to disease incidence and the early stages of infection, while resistance detected in the seed-germination assay and latent period were more strongly related to FHB disease severity. Resistance detected using the seed-germination assay was notable as it related to a greater decline in the level of FDK and a smaller reduction in DON than would have been expected from the reduction in FHB disease assessed by visual symptoms.
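The variance-explained figures quoted above come from multiple linear regression of each disease variable on the assay components. A hypothetical, self-contained Python illustration of that calculation (synthetic data and arbitrary coefficients, not the nursery data) is sketched below.

    import numpy as np

    # Regress an FHB disease variable on four PDR components (incubation period,
    # latent period, lesion length, germination rate) and read off the share of
    # variation explained (R^2), as in the abstract's analysis.
    rng = np.random.default_rng(1)
    n = 30                                           # 30 nursery entries
    X = rng.normal(size=(n, 4))                      # synthetic, standardized components
    beta = np.array([-0.8, 0.6, -0.4, 0.5])          # arbitrary coefficients for illustration
    y = X @ beta + rng.normal(scale=0.8, size=n)     # synthetic disease scores

    Xd = np.column_stack([np.ones(n), X])            # add an intercept column
    coef, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    resid = y - Xd @ coef
    r2 = 1 - resid.var() / y.var()
    print(f"R^2 = {r2:.2f}")                         # fraction of variation explained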

Relevance:

10.00%

Abstract:

Abstract. In the theory of central simple algebras, one often deals with abelian groups arising from the kernel or cokernel of functors that respect transfer maps (for example, K-functors). Since a central simple algebra splits over a suitable extension and the functors above are "trivial" in the split case, one can prove a certain calculus for these functors. Common examples are the kernel or cokernel of the maps K_i(F) → K_i(D), where the K_i are Quillen K-groups, D is a division algebra and F its centre; the homotopy fibre arising from the long exact sequence of the above map; and the reduced Whitehead group SK_1. In this note we introduce an abstract functor over the category of Azumaya algebras which covers all the functors mentioned above and prove the usual calculus for it. This, for example, immediately shows that the K-theory of an Azumaya algebra over a local ring is "almost" the same as the K-theory of the base ring. The main result is that the reduced K-theory of an Azumaya algebra over a Henselian ring coincides with the reduced K-theory of its residue central simple algebra. The note ends with calculations aimed at determining the homotopy fibres mentioned above.
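In standard notation (adopted here for illustration, not necessarily the note's), the groups in question are, for a division algebra D with centre F:

    % Kernel and cokernel of the extension-of-scalars map on Quillen K-groups:
    \[
      \mathrm{ZK}_i(D) \;=\; \ker\bigl(K_i(F) \longrightarrow K_i(D)\bigr),
      \qquad
      \mathrm{CK}_i(D) \;=\; \operatorname{coker}\bigl(K_i(F) \longrightarrow K_i(D)\bigr),
    \]
    % and the reduced Whitehead group, with Nrd the reduced norm:
    \[
      \mathrm{SK}_1(D) \;=\; \bigl\{\, x \in D^{\times} : \mathrm{Nrd}(x) = 1 \,\bigr\}
      \big/ \,[D^{\times}, D^{\times}].
    \]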

Relevance:

10.00%

Abstract:

This paper builds on work presented in the first paper, Part 1 [1], and is of equal significance. The paper proposes a novel compensation method to preserve the integrity of the step-fault signatures, prevalent in various processes, that can be masked during the removal of both autocorrelation and cross-correlation. Using industrial data, the paper demonstrates the benefit of the proposed method, which is applicable to chemical, electrical, and mechanical process monitoring. This paper (and Part 1 [1]) has led to further work, supported by EPSRC grant GR/S84354/01, involving kernel PCA methods.

Relevance:

10.00%

Abstract:

It is shown how the fractional probability density diffusion equation for the diffusion limit of one-dimensional continuous time random walks may be derived from a generalized Markovian Chapman-Kolmogorov equation. The non-Markovian behaviour is incorporated into the Markovian Chapman-Kolmogorov equation by postulating a Lévy-like distribution of waiting times as a kernel. The Chapman-Kolmogorov equation so generalized then takes the form of a convolution integral. The dependence on the initial conditions typical of a non-Markovian process is treated by adding to the convolution integral a time-dependent term involving the survival probability. In the diffusion limit these two assumptions about the past history of the process are sufficient to reproduce anomalous diffusion and relaxation behaviour of the Cole-Cole type. The Green function in the diffusion limit is calculated using the fact that the characteristic function is the Mittag-Leffler function; Fourier inversion of the characteristic function yields the Green function in terms of a Wright function. The moments of the distribution function are evaluated from the Mittag-Leffler function using the properties of characteristic functions, and a relation between the powers of the second moment and the higher-order even moments is derived.
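In the standard notation of fractional diffusion (adopted here for illustration; the paper's symbols may differ), the characteristic function of the Green function and the resulting subdiffusive second moment read:

    % Characteristic function of the Green function for fractional diffusion of
    % order alpha in (0,1), with E_alpha the Mittag-Leffler function:
    \[
      \hat{G}(k,t) \;=\; E_{\alpha}\!\bigl(-D\,k^{2}\,t^{\alpha}\bigr),
      \qquad
      E_{\alpha}(z) \;=\; \sum_{n=0}^{\infty} \frac{z^{n}}{\Gamma(\alpha n + 1)},
    \]
    % whence the second moment follows by differentiating twice at k = 0:
    \[
      \bigl\langle x^{2}(t)\bigr\rangle \;=\;
      -\,\partial_{k}^{2}\hat{G}(k,t)\big|_{k=0}
      \;=\; \frac{2\,D\,t^{\alpha}}{\Gamma(1+\alpha)} .
    \]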

Relevance:

10.00%

Abstract:

The identification of non-linear systems using only observed finite datasets has become a mature research area over the last two decades. A class of linear-in-the-parameters models with universal approximation capabilities has been intensively studied and widely used, owing to the availability of many linear learning algorithms and their inherent convergence conditions. This article presents a systematic overview of basic research on model selection approaches for linear-in-the-parameters models. One of the fundamental problems in non-linear system identification is to find the minimal model with the best generalisation performance from observational data alone. The important concepts used in achieving good model generalisation in various non-linear system-identification algorithms are first reviewed, including Bayesian parameter regularisation and model selection criteria based on cross-validation and experimental design. A significant advance in machine learning has been the development of the support vector machine as a means of identifying kernel models based on the structural risk minimisation principle. Developments in convex-optimisation-based model construction algorithms, including support vector regression algorithms, are outlined. Input selection algorithms and on-line system identification algorithms are also included in this review. Finally, some industrial applications of non-linear models are discussed.
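A linear-in-the-parameters model, such as a radial basis function network, is linear in its weights once the basis functions are fixed, so regularised least squares applies directly. The following Python sketch uses hypothetical data; the kernel width, centre placement and ridge regulariser are illustrative choices, not the article's prescriptions.

    import numpy as np

    def rbf_design(X, centres, width):
        # Design matrix of a linear-in-the-parameters RBF model: one Gaussian
        # basis function per centre, evaluated at every input point.
        d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * width ** 2))

    # One-dimensional toy example: candidate centres are the training inputs,
    # and a ridge (Bayesian-style) regulariser controls model generalisation.
    rng = np.random.default_rng(2)
    X = rng.uniform(-3, 3, size=(50, 1))
    y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=50)

    Phi = rbf_design(X, X, width=0.7)          # model output is linear in w
    lam = 1e-2                                 # regularisation strength
    w = np.linalg.solve(Phi.T @ Phi + lam * np.eye(50), Phi.T @ y)
    print("training RMSE:", np.sqrt(np.mean((Phi @ w - y) ** 2)))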

Relevance:

10.00%

Abstract:

High-cadence, multiwavelength optical observations of a solar active region (NOAA AR 10969), obtained with the Swedish Solar Telescope, are presented. Difference imaging of white-light continuum data reveals a white-light brightening, 2 minutes in duration, linked to a cotemporal and cospatial C2.0 flare event. The flare kernel observed in the white-light images has a diameter of 300 km, thus rendering it below the resolution limit of most space-based telescopes. Continuum emission is present only during the impulsive stage of the flare, with the effects of chromospheric emission subsequently delayed by ≈2 minutes. The localized flare emission peaks at 300% above the quiescent flux. This large, yet tightly confined, increase in emission is only resolvable owing to the high spatial resolution of the Swedish Solar Telescope. An investigation of the line-of-sight magnetic field derived from simultaneous MDI data shows that the continuum brightening is located very close to a magnetic polarity inversion line. In addition, an Hα flare ribbon is directed along a region of rapid magnetic energy change, with the footpoints of the ribbon remaining cospatial with the observed white-light brightening throughout the duration of the flare. The observed flare parameters are compared with current observations and theoretical models for M- and X-class events, and we determine that the observed white-light emission is caused by radiative back-warming. We suggest that the creation of white-light emission is a common feature of all solar flares.

Relevance:

10.00%

Abstract:

This study evaluates the performance of Bag-of-Visual-terms (BOV) methods for the automatic classification of digital images from the database of the artist Miquel Planas. These images play a part in the conception and design of his sculptural production. This constitutes an interesting challenge, given the difficulty of categorizing scenes when they differ more in their semantic content than in the objects they contain. We employed a kernel-based recognition method introduced by Lazebnik, Schmid and Ponce in 2006. The results are promising: on average, the performance score is approximately 70%. The experiments suggest that automatic image categorization based on computer-vision methods can provide objective principles for cataloguing images, and that the results obtained can be applied in different fields of artistic creation.
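A bag-of-visual-words pipeline of this general kind quantizes local image descriptors against a learned visual vocabulary, represents each image as a histogram of visual words, and trains a classifier on the histograms. The Python sketch below is a hypothetical scikit-learn illustration; it omits the spatial-pyramid matching and histogram-intersection kernel of Lazebnik, Schmid and Ponce, using a plain RBF-kernel SVM instead.

    import numpy as np
    from sklearn.cluster import MiniBatchKMeans
    from sklearn.svm import SVC

    def bovw_histograms(descriptor_sets, kmeans):
        # Map each image's local descriptors to a normalized visual-word histogram.
        k = kmeans.n_clusters
        hists = np.zeros((len(descriptor_sets), k))
        for i, desc in enumerate(descriptor_sets):
            words = kmeans.predict(desc)               # quantize descriptors to words
            hists[i] = np.bincount(words, minlength=k)
        return hists / np.maximum(hists.sum(1, keepdims=True), 1)

    # descriptor_sets: list of (n_i, d) local-feature arrays (e.g. SIFT), one per
    # image; labels: per-image scene categories. Both are assumed given here.
    def train_bovw(descriptor_sets, labels, k=200):
        kmeans = MiniBatchKMeans(n_clusters=k, random_state=0).fit(np.vstack(descriptor_sets))
        clf = SVC(kernel="rbf").fit(bovw_histograms(descriptor_sets, kmeans), labels)
        return kmeans, clf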

Relevance:

10.00%

Abstract:

Voice over IP (VoIP) has experienced tremendous growth over the last few years and is now widely used among the population and for business purposes. The security of such VoIP systems is often taken for granted, creating a false sense of privacy. This paper investigates in detail the leakage of information from Skype, a widely used and protected VoIP application. Experiments have shown that isolated phonemes can be classified and given sentences can be identified. By using the dynamic time warping (DTW) algorithm, frequently used in speech processing, an accuracy of 60% can be reached. The results can be further improved by choosing specific training data, reaching an accuracy of 83% under specific conditions. Because the initial results are speaker-dependent, an approach involving the Kalman filter is proposed to extract the kernel of all training signals.
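Dynamic time warping, the alignment algorithm named above, computes the cost of the best monotone alignment between two sequences by dynamic programming. Below is a textbook Python implementation on plain 1-D series; in the paper's setting the sequences would be packet or feature streams, which are not reproduced here.

    import numpy as np

    def dtw_distance(a, b):
        # Classic O(len(a) * len(b)) dynamic programme: each cell D[i, j] holds
        # the cost of the best warping path aligning a[:i] with b[:j].
        n, m = len(a), len(b)
        D = np.full((n + 1, m + 1), np.inf)
        D[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                cost = abs(a[i - 1] - b[j - 1])
                D[i, j] = cost + min(D[i - 1, j],      # insertion
                                     D[i, j - 1],      # deletion
                                     D[i - 1, j - 1])  # match
        return D[n, m]

    # Example: two time-shifted versions of the same shape align at small cost.
    t = np.linspace(0, 2 * np.pi, 80)
    print(dtw_distance(np.sin(t), np.sin(t + 0.4)))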