865 results for kernel estimator


Relevance:

20.00%

Publisher:

Abstract:

To understand our project, we must first understand DEVS. In the 1970s, the mathematician Bernard Zeigler proposed a general formalism for representing discrete event systems (DES). This formalism, called DEVS (Discrete EVent System Specification), is the most general formalism for handling DES. DEVS can represent any system whose behavior can be described as a sequence of discrete events, characterized by a time base in which only a finite number of events may occur. DEVS modeling and simulation has multiple implementations in several programming languages, such as Java, C#, and C++. However, there is a need for a stable distributed platform that provides interoperability mechanisms and integrates diverse DEVS models. In this project we are given, as a code base, the xDEVS core in Java, in both sequential and parallel versions. Our task is to implement the core in a distributed fashion, so that a DEVS system can be split across several machines. For this we used Java sockets, to make data transmission as efficient as possible. First, the number of machines that will connect to the server must be specified. Once they have connected, each machine is sent the specific piece of work it must simulate. Note that there are two ways of partitioning a DEVS system, and both are implemented in our project. The first is to split it into atomic modules, the indivisible subsystems of a DEVS system. The second is to group the functions of all the subsystems and distribute the groups among the machines.
In summary, our distributed system works as follows: the work assigned to the first client is executed; when it finishes, the client updates the server's information, and the server then sends the order to the next client, and so on.
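The coordination pattern described above (a server accepts a fixed number of clients, sends each its assigned piece of work, and advances only after the previous client reports back) can be sketched in a few lines. The project itself uses Java sockets; the sketch below uses Python's socket module purely for illustration, and names such as run_server, run_client, and the "done:" reply format are assumptions, not the project's actual protocol.

```python
import socket
import threading

def run_server(tasks, results, addr_box, ready):
    """Coordinator: accepts one connection per task, then dispatches the
    tasks sequentially, waiting for each client's report before moving on."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("127.0.0.1", 0))               # ephemeral port
    srv.listen()
    addr_box.append(srv.getsockname())       # publish (host, port) to clients
    ready.set()
    conns = [srv.accept()[0] for _ in tasks]
    for conn, task in zip(conns, tasks):     # sequential hand-off, as described
        conn.sendall(task.encode())
        results.append(conn.recv(1024).decode())
        conn.close()
    srv.close()

def run_client(addr):
    """Client: waits for its assigned work, 'simulates' it, reports back."""
    with socket.create_connection(addr) as cli:
        task = cli.recv(1024).decode()
        cli.sendall(("done:" + task).encode())

if __name__ == "__main__":
    tasks, results, addr_box, ready = ["atomic-1", "atomic-2"], [], [], threading.Event()
    server = threading.Thread(target=run_server, args=(tasks, results, addr_box, ready))
    server.start()
    ready.wait()
    clients = [threading.Thread(target=run_client, args=(addr_box[0],)) for _ in tasks]
    for c in clients:
        c.start()
    for c in clients:
        c.join()
    server.join()
    print(results)
```

Because the server only sends the next task after receiving the previous client's report, the clients run strictly one after another, mirroring the sequential hand-off the abstract describes.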

Relevance:

20.00%

Publisher:

Abstract:

Virtually every sector of business and industry that uses computing, including financial analysis, search engines, and electronic commerce, incorporates Big Data analysis into its business model. Sophisticated clustering algorithms are popular for deducing the nature of data by assigning labels to unlabeled data. We address two main challenges in Big Data. First, by definition, the volume of Big Data is too large to be loaded into a computer's memory (this volume changes based on the computer used or available, but there is always a data set that is too large for any computer). Second, in real-time applications, the velocity of new incoming data prevents historical data from being stored and future data from being accessed. Therefore, we propose our Streaming Kernel Fuzzy c-Means (stKFCM) algorithm, which significantly reduces both computational complexity and space complexity. The proposed stKFCM requires only O(n²) memory, where n is the (predetermined) size of a data subset (or data chunk) at each time step, which makes the algorithm truly scalable (as n can be chosen based on the available memory). Furthermore, only 2n² elements of the full N × N (where N >> n) kernel matrix need to be calculated at each time step, reducing both the time spent producing the kernel elements and the complexity of the FCM algorithm. Empirical results show that stKFCM, even with relatively small n, can provide clustering performance as accurate as kernel fuzzy c-means run on the entire data set, while achieving a significant speedup.
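The kernel fuzzy c-means update that stKFCM builds on works directly from a precomputed kernel matrix: cluster centers live implicitly in feature space, and only kernel entries are needed to compute point-to-center distances. The sketch below is the plain (non-streaming) kernel FCM iteration on a single chunk, not the full stKFCM algorithm; the function name and defaults are illustrative.

```python
import numpy as np

def kernel_fcm(K, c, m=2.0, iters=50, seed=0):
    """Kernel fuzzy c-means on a precomputed n x n kernel matrix K.
    Returns the n x c fuzzy membership matrix U (rows sum to 1)."""
    n = K.shape[0]
    rng = np.random.default_rng(seed)
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(iters):
        W = U ** m                            # fuzzified memberships, n x c
        s = W.sum(axis=0)                     # per-cluster normalizers
        # squared feature-space distance of each point to each implicit center:
        # K_ii - 2 sum_j w_jc K_ij / s_c + sum_{j,l} w_jc K_jl w_lc / s_c^2
        d2 = (np.diag(K)[:, None]
              - 2.0 * (K @ W) / s
              + np.einsum('jc,jl,lc->c', W, K, W) / s ** 2)
        d2 = np.maximum(d2, 1e-12)            # guard against division by zero
        inv = d2 ** (-1.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)
    return U
```

On two well-separated groups of points with a Gaussian kernel, the hardened memberships (argmax per row) recover the groups; stKFCM's contribution is doing this while seeing only O(n²) kernel entries per chunk.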

Relevance:

20.00%

Publisher:

Abstract:

In this thesis we study the heat kernel, a useful tool for analyzing various properties of different quantum field theories. In particular, we focus on the study of the one-loop effective action and on the application of worldline path integrals to derive, perturbatively, the heat kernel coefficients for the Proca theory of massive vector fields. It turns out that the worldline path integral method encounters difficulties when the differential operator of the heat kernel is of non-minimal type. More precisely, a direct recasting of the differential operator in terms of worldline path integrals produces a non-perturbative vertex in the classical action, and the path integral cannot be solved in closed form. In this work we seek ways to circumvent this issue and suggest how similar problems might be addressed in other contexts.
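For reference, the two objects the abstract refers to can be written compactly (Euclidean conventions; normalizations and sign conventions vary by author): the one-loop effective action for an operator H is governed by the trace of the heat kernel, whose small-t expansion defines the heat kernel coefficients a_k.

```latex
\Gamma_{\text{1-loop}} \;=\; \tfrac{1}{2}\ln\det H
  \;=\; -\tfrac{1}{2}\int_0^\infty \frac{dt}{t}\,\operatorname{Tr} e^{-tH},
\qquad
\operatorname{Tr} e^{-tH} \;\sim\; \frac{1}{(4\pi t)^{d/2}}
  \sum_{k\ge 0} t^{k} \int d^d x\,\sqrt{g}\;\operatorname{tr} a_k(x).
\]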

Relevance:

10.00%

Publisher:

Abstract:

Shelled, roasted, and salted cashew nut kernels were packaged in three flexible materials with different barrier properties (PP/PE = polypropylene/polyethylene; PETmet/PE = metallized polyethylene terephthalate/polyethylene; PET/Al/LDPE = polyethylene terephthalate/aluminum foil/low-density polyethylene). Kernels were stored for one year at 30 °C and 80% relative humidity. Quantitative descriptive sensory analysis (QDA) was performed at the end of the storage period. The descriptive terms obtained to characterize the kernels were brown color, color uniformity, and rugosity for appearance; toasted kernel, sweet, old, and rancid for odor; toasted kernel, sweet, old, rancid, salty, and bitter for taste; and crispness for texture. QDA showed that the factors responsible for the decrease in sensory quality after one year of storage were increases in old and rancid aroma and taste, a decrease in toasted kernel aroma and taste, and a decrease in crispness. The loss of sensory quality was greatest in kernels packaged in PP/PE.

Relevance:

10.00%

Publisher:

Abstract:

The objective was to identify factors associated with edentulism and its spatial risk among the elderly. A cross-sectional study was carried out on a sample of 372 individuals aged 60 years and over in the municipality of Botucatu, São Paulo, Brazil, in 2005. Crude and adjusted prevalence ratios were estimated by Poisson regression with robust variance estimation and hierarchical modeling procedures. Spatial analysis was performed using kernel density estimation. The prevalence of edentulism was 63.17%. The sociodemographic factors associated with edentulism were low schooling, a higher number of persons per room, not owning a car, and older age, along with the presence of comorbidities, the lack of a regular dentist, and having had the last dental visit three or more years earlier. The spatial analysis showed higher risk in peripheral areas. The results provide a better understanding of tooth loss among the elderly, supporting the planning of public health actions.
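The kernel density estimation used for the spatial risk analysis can be illustrated with a minimal sketch: a fixed-bandwidth Gaussian kernel evaluated over a grid of locations. The abstract does not specify the software or bandwidth used in the study, so the function name, the bandwidth default, and the example coordinates below are purely illustrative.

```python
import numpy as np

def kde2d(points, grid, bandwidth=1.0):
    """Gaussian kernel density estimate of 2-D event locations,
    evaluated at each grid location."""
    diff = grid[:, None, :] - points[None, :, :]   # shape (g, n, 2)
    sq = (diff ** 2).sum(axis=2) / bandwidth ** 2
    # normalized 2-D Gaussian kernel, averaged over the n events
    return np.exp(-0.5 * sq).sum(axis=1) / (2 * np.pi * bandwidth ** 2 * len(points))
```

Evaluated over a fine grid covering the study area, the resulting surface highlights regions where cases cluster, which is how "higher risk in peripheral areas" is read off a kernel map.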

Relevance:

10.00%

Publisher:

Abstract:

The northern portion of the Cerrado domain is historically one of the least known areas with respect to its biodiversity. Recently, some studies have revealed richness values comparable to those of other regions within the domain. The Serra Geral do Tocantins Ecological Station (EESGT) is located in the Jalapão region, in the northeastern portion of the Cerrado, and is part of the largest block of protected areas in this domain. In this study we describe the amphibian species richness and composition of the EESGT, discuss them in a biogeographic context, and characterize the use of breeding sites by the recorded amphibian species in relation to vegetation types and types of water bodies. We used active searches and pitfall traps during the period considered the peak of the breeding season for most Cerrado species. Thirty-six amphibian species were recorded in the EESGT, for a total of 39 species known for the Jalapão region. Applying the Jackknife estimator, we suggest a potential richness of 42 species for the EESGT. Most of the recorded species are endemic to or strongly associated with the Cerrado, followed by species widely distributed in Brazil or South America. Most species breed in temporary ponds located in open areas, although some species occur exclusively in gallery forests and use lotic water bodies to breed.
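The jackknife richness estimator mentioned above extrapolates total richness from how many species were seen only once across sampling units. The abstract does not say which order of jackknife was used, so the sketch below shows the common first-order version, S_jack1 = S_obs + Q1·(m − 1)/m, where Q1 is the number of species detected in exactly one of the m sampling units.

```python
def jackknife1(samples):
    """First-order jackknife species-richness estimator.
    samples: list of sets, each holding the species detected
    in one sampling unit (e.g. one site or one survey)."""
    m = len(samples)
    observed = set().union(*samples)
    counts = {sp: sum(sp in s for s in samples) for sp in observed}
    q1 = sum(1 for n in counts.values() if n == 1)  # "uniques"
    return len(observed) + q1 * (m - 1) / m
```

The more species that appear in only a single unit, the larger the estimated number of species still undetected, which is how a survey recording 36 species can support a potential richness estimate of 42.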

Relevance:

10.00%

Publisher:

Abstract:

The objective of this study was to evaluate the agronomic characteristics, chemical-bromatological composition, and digestibility of 11 corn cultivars (Zea mays) harvested at two cutting heights. Cultivars D 766, D 657, D 1000, P 3021, P 3041, C 805, C 333, AG 5011, FO 01, CO 9621, and BR 205 were evaluated when harvested 5 cm above ground (low) and 5 cm below the insertion of the first ear (high). The experiment was designed as randomized blocks with three replicates, arranged in an 11 × 2 factorial scheme. Cultivars showed similar forage dry matter and grain yields. The percentages of the stalk, leaf, straw, cob, and kernel fractions differed among cultivars, as did the dry matter content of the whole plant at harvest. Considering the whole plant, only the contents of gross energy and of nitrogen in neutral detergent fiber, and the in vitro digestibility of neutral and acid detergent fiber, did not differ among cultivars. Raising the cutting height improved forage quality by reducing the stalk and leaf fractions and the content of cell wall constituents.

Relevance:

10.00%

Publisher:

Abstract:

This paper assesses the performance of a thermal simulation program (Arquitrop) in different households in the city of São Paulo, Brazil. The households were selected for the Wheezing Project, which followed children under 2 years of age to monitor the occurrence of respiratory diseases. The results show that in all three study households there is good agreement between the observed and simulated indoor temperatures. A fairly consistent and realistic relationship was also observed between the simulated indoor and outdoor temperatures, indicating that the Arquitrop model is an efficient estimator and a good representation of the thermal behavior of households in the city of São Paulo. The worst simulation was associated with the poorest type of construction; this may be explained by the low construction quality, which Arquitrop could not simulate adequately.

Relevance:

10.00%

Publisher:

Abstract:

The main goal of this paper is to establish some equivalence results on stability, recurrence, and ergodicity between a piecewise deterministic Markov process (PDMP) {X(t)} and an embedded discrete-time Markov chain {Theta(n)} generated by a Markov kernel G that can be explicitly characterized in terms of the three local characteristics of the PDMP, leading to tractable criterion results. First we establish some important results characterizing {Theta(n)} as a sampling of the PDMP {X(t)} and deriving a connection between the probability of the first return time to a set for the discrete-time Markov chains generated by G and the resolvent kernel R of the PDMP. From these results we obtain equivalence results regarding irreducibility, existence of sigma-finite invariant measures, and (positive) recurrence and (positive) Harris recurrence between {X(t)} and {Theta(n)}, generalizing the results of [F. Dufour and O. L. V. Costa, SIAM J. Control Optim., 37 (1999), pp. 1483-1502] in several directions. Sufficient conditions in terms of a modified Foster-Lyapunov criterion are also presented to ensure positive Harris recurrence and ergodicity of the PDMP. We illustrate the use of these conditions by showing the ergodicity of a capacity expansion model.

Relevance:

10.00%

Publisher:

Abstract:

Gaussianity and statistical isotropy of the Universe are modern cosmology's minimal set of hypotheses. In this work we introduce a new statistical test to detect observational deviations from this minimal set. By defining the temperature correlation function over the whole celestial sphere, we are able to independently quantify both angular and planar dependence (modulations) of the CMB temperature power spectrum over different slices of this sphere. Given that planar dependence leads to further modulations of the usual angular power spectrum C(l), this test can potentially reveal richer structures in the morphology of the primordial temperature field. We have also constructed an unbiased estimator for this angular-planar power spectrum, which naturally generalizes the estimator for the usual C(l)'s. With the help of a chi-square analysis, we have used this estimator to search for observational deviations from statistical isotropy in WMAP's 5-year release data set (ILC5), where we found only slight anomalies at the angular scales l = 7 and l = 8. Since this angular-planar statistic is model-independent, it is ideal for searches of statistical anisotropy (e.g., contamination from the galactic plane) and for characterizing non-Gaussianities.

Relevance:

10.00%

Publisher:

Abstract:

We analyze renormalizability properties of noncommutative (NC) theories with a bifermionic NC parameter. We introduce a new four-dimensional scalar field model which is renormalizable at all orders of the loop expansion. We show that this model has an infrared stable fixed point (at the one-loop level). We check that the NC QED (which is one-loop renormalizable with a usual NC parameter) remains renormalizable when the NC parameter is bifermionic, at least to the extent of one-loop diagrams with external photon legs. Our general conclusion is that bifermionic noncommutativity improves renormalizability properties of NC theories.

Relevance:

10.00%

Publisher:

Abstract:

Finite-size scaling analysis turns out to be a powerful tool for calculating the phase diagram as well as the critical properties of two-dimensional classical statistical mechanics models and quantum Hamiltonians in one dimension. The most widely used method to locate quantum critical points is the so-called crossing method, where the estimates are obtained by comparing the mass gaps of two distinct lattice sizes. The success of this method is due to its simplicity and its ability to provide accurate results even for relatively small lattice sizes. In this paper, we introduce an estimator that locates quantum critical points by exploiting the known distinct behavior of the entanglement entropy in critical and noncritical systems. As a benchmark test, we use this new estimator to locate the critical point of the quantum Ising chain and the critical line of the spin-1 Blume-Capel quantum chain. The tricritical point of this last model is also obtained. A comparison with the standard crossing method is also presented. The method we propose is simple to implement in practice, particularly in density matrix renormalization group calculations, and, like the crossing method, provides remarkably accurate results for quite small lattice sizes. Our applications show that the proposed method has several advantages over the standard crossing method, and we believe it will become popular in future numerical studies.

Relevance:

10.00%

Publisher:

Abstract:

Background: There are several studies in the literature depicting measurement error in gene expression data and, likewise, several others about regulatory network models. However, only a small fraction combine measurement error with mathematical regulatory networks and show how to identify these networks under different rates of noise. Results: This article investigates the effects of measurement error on the estimation of the parameters in regulatory networks. Simulation studies indicate that, in both time series (dependent) and non-time-series (independent) data, the measurement error strongly affects the estimated parameters of the regulatory network models, biasing them as predicted by theory. Moreover, when testing the parameters of the regulatory network models, p-values computed by ignoring the measurement error are not reliable, since the rate of false positives is not controlled under the null hypothesis. To overcome these problems, we present an improved version of the Ordinary Least Squares estimator for independent (regression models) and dependent (autoregressive models) data when the variables are subject to noise. Measurement error estimation procedures for microarrays are also described. Simulation results show that both corrected methods perform better than the standard ones (i.e., those ignoring measurement error). The proposed methodologies are illustrated using microarray data from lung cancer patients and mouse liver time series data. Conclusions: Measurement error dangerously affects the identification of regulatory network models; it must therefore be reduced or taken into account to avoid erroneous conclusions. This could be one of the reasons for the high biological false positive rates identified in actual regulatory network models.
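The bias the abstract describes can be seen in the simplest setting. The paper's corrections cover both regression and autoregressive models; the sketch below shows only the textbook attenuation correction for a single noisy regressor with known noise variance, as an illustration of why ignoring measurement error biases estimates toward zero. The function name and the example parameters are illustrative.

```python
import numpy as np

def slopes(x_obs, y, noise_var):
    """Naive and measurement-error-corrected OLS slopes for y = beta * x,
    observing x_obs = x + e with known Var(e) = noise_var
    (classical attenuation correction for a single regressor)."""
    x_c = x_obs - x_obs.mean()
    y_c = y - y.mean()
    beta_naive = (x_c @ y_c) / (x_c @ x_c)          # attenuated by Var(x)/Var(x_obs)
    reliability = 1.0 - noise_var / x_obs.var()      # estimate of Var(x)/Var(x_obs)
    return beta_naive, beta_naive / reliability
```

With a true slope of 2 and noise variance equal to half the signal variance, the naive OLS slope shrinks toward 2/1.5 ≈ 1.33, while dividing by the estimated reliability recovers the true value, which is the mechanism behind the corrected estimators the article proposes.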