971 results for order statistics


Relevance:

60.00%

Publisher:

Abstract:

Studies in turbulence often focus on two flow conditions, both of which occur frequently in real-world flows and are sought-after for their value in advancing turbulence theory. These are the high Reynolds number regime and the effect of wall surface roughness. In this dissertation, a Large-Eddy Simulation (LES) recreates both conditions over a wide range of Reynolds numbers Reτ = O(10²)-O(10⁸) and accounts for roughness by locally modeling the statistical effects of near-wall anisotropic fine scales in a thin layer immediately above the rough surface. A subgrid, roughness-corrected wall model is introduced to dynamically transmit this modeled information from the wall to the outer LES, which uses a stretched-vortex subgrid-scale model operating in the bulk of the flow. Of primary interest is the Reynolds number and roughness dependence of these flows in terms of first- and second-order statistics. The LES is first applied to a fully turbulent, uniformly smooth/rough channel flow to capture the flow dynamics over the smooth, transitionally rough and fully rough regimes. Results include a Moody-like diagram for the wall-averaged friction factor, believed to be the first of its kind obtained from LES. Confirmation is found for experimentally observed logarithmic behavior in the normalized stream-wise turbulent intensities. Tight logarithmic collapse, scaled on the wall friction velocity, is found for smooth-wall flows when Reτ ≥ O(10⁶) and in fully rough cases. Since the wall model operates locally and dynamically, the framework is used to investigate non-uniform roughness distributions in a channel, where the flow adjustments to sudden surface changes are investigated. Recovery of mean quantities and turbulent statistics after the transitions is discussed qualitatively and quantitatively at various roughness and Reynolds number levels. The internal boundary layer, defined as the border between the flow affected by the new surface condition and the unaffected part, is computed, and a collapse of the profiles on a length scale containing the logarithm of the friction Reynolds number is presented. Finally, we turn to the possibility of expanding the present framework to accommodate more general geometries. As a first step, the whole LES framework is modified for use in the curvilinear geometry of a fully developed turbulent pipe flow, with the implementation carried out in a spectral element solver capable of handling complex wall profiles. The friction factors show favorable agreement with the superpipe data, and the LES estimates of the Kármán constant and additive constant of the log-law closely match values obtained from experiment.
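
For reference, the Kármán constant and additive constant mentioned above are the parameters of the logarithmic law of the wall, written here in its standard rough-wall textbook form rather than quoted from the dissertation:

```latex
% Standard log-law of the wall with a roughness offset (textbook form, not quoted
% from the dissertation): kappa is the Karman constant, B the smooth-wall additive
% constant, and Delta U^+ the roughness function (zero for a smooth wall).
U^+ = \frac{1}{\kappa}\,\ln y^+ \;+\; B \;-\; \Delta U^+
```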

Relevance:

60.00%

Publisher:

Abstract:

The feedback coding problem for Gaussian systems in which the noise is neither white nor statistically independent between channels is formulated in terms of arbitrary linear codes at the transmitter and at the receiver. This new formulation is used to determine a number of feedback communication systems. In particular, the optimum linear code that satisfies an average power constraint on the transmitted signals is derived for a system with noiseless feedback and forward noise of arbitrary covariance. The noisy feedback problem is considered, and signal sets for the forward and feedback channels are obtained with an average power constraint on each. The general formulation and results are valid for non-Gaussian systems in which the second-order statistics are known, the results being applicable to the determination of error bounds via the Chebyshev inequality.
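
The Chebyshev route mentioned at the end works precisely because the bound needs only second-order knowledge of the noise; in its standard form:

```latex
% Chebyshev inequality: an error bound that requires only the mean and variance,
% which is why knowing the second-order statistics of the (possibly non-Gaussian)
% noise is sufficient to bound the probability of large errors.
P\bigl(\lvert X - \mu \rvert \ge a\bigr) \;\le\; \frac{\sigma^{2}}{a^{2}}, \qquad a > 0
```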

Relevance:

60.00%

Publisher:

Abstract:

In this work we consider the use of voice activity detectors as a pre-processing stage for a time-domain blind source separation technique that employs second-order statistics to separate convolutive, determined mixtures. The algorithm was adapted to perform the separation both in full band and in sub-bands, considering the presence and absence of silence intervals in mixtures of speech signals. The main idea is to detect the segments of the mixtures that contain voice activity, preventing the separation algorithm from being triggered in the absence of speech, which improves performance and reduces the computational cost.
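
A minimal sketch of the gating idea, assuming a simple frame-energy detector (the detector, function names and thresholds here are illustrative assumptions, not the paper's actual choices):

```python
import numpy as np

def energy_vad(frames, threshold_db=-40.0):
    """Flag frames whose energy lies within `threshold_db` dB of the loudest
    frame. Hypothetical, illustrative detector; the detector actually used in
    the paper may be more elaborate. `frames` has shape (n_frames, frame_len)."""
    energy_db = 10.0 * np.log10(np.mean(frames ** 2, axis=1) + 1e-12)
    return energy_db > (energy_db.max() + threshold_db)

# Only frames flagged as containing voice activity are handed to the
# second-order-statistics separation routine (not shown); silent stretches are
# skipped, which is where the performance and cost gains come from.
```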

Relevance:

60.00%

Publisher:

Abstract:

This paper deals with the experimental evaluation of a flow analysis system based on the integration of an under-resolved Navier-Stokes simulation with experimental measurements through a feedback mechanism (referred to as Measurement-Integrated, or MI, simulation), applied to the case of a planar turbulent co-flowing jet. The experiments are performed with an inner-to-outer-jet velocity ratio of about 2 and a Reynolds number, based on the inner-jet height, of about 10000. The measurement system is a high-speed PIV, which provides time-resolved data of the flow field on a field of view extending 20 jet heights downstream of the jet outlet. The experimental data can thus be used both to provide the feedback data for the simulations and to validate the MI-simulations over a wide region. The effect of reduced data rate and spatial extent of the feedback (i.e. measurements are not available at each simulation time step or discretization point) was investigated. At first, simulations were run with full information in order to obtain an upper limit on MI-simulation performance. The results show the potential of this methodology to reproduce first- and second-order statistics of the turbulent flow with good accuracy. Then, to deal with the reduced data, different feedback strategies were tested. It was found that for a small data-rate reduction the results are essentially equivalent to the full-information feedback case, but as the feedback data rate is reduced further the error increases and tends to be localized in regions of high turbulent activity. Moreover, the spatial distribution of the error looks qualitatively different for different feedback strategies. Feedback gain distributions calculated by optimal control theory are presented and proposed as a means to make it possible to perform MI-simulations based on localized measurements only. So far, we have not been able to achieve low error between measurements and simulations by using these gain distributions.
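
The feedback mechanism can be summarised, in a generic form that is an assumption here rather than a quotation from the paper, as an extra forcing term that drives the under-resolved simulation towards the PIV measurements wherever and whenever they are available:

```latex
% Generic measurement-integrated feedback: the simulated momentum equation is
% augmented with a term proportional to the measurement-simulation mismatch,
% applied only at points and times where PIV data exist. K is the feedback gain
% (possibly spatially distributed, as in the optimal-control gains of the paper).
\frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u}\cdot\nabla)\mathbf{u}
  = -\frac{1}{\rho}\nabla p + \nu\,\nabla^{2}\mathbf{u}
  + K\bigl(\mathbf{u}_{\mathrm{meas}} - \mathbf{u}\bigr)
```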

Relevance:

60.00%

Publisher:

Abstract:

We present measurements of grid turbulence using 2D particle image velocimetry taken immediately downstream from the grid at a Reynolds number of Re_M = 16500, where M is the rod spacing. A long field of view of 14M × 4M in the down- and cross-stream directions was achieved by stitching multiple cameras together. Two uniform biplanar grids were selected to have the same M and pressure drop but different rod diameter D and cross-section. A large data set (10⁴ vector fields) was obtained to ensure good convergence of second-order statistics. Estimates of the dissipation rate ε of turbulent kinetic energy (TKE) were found to be sensitive to the number of mean-squared velocity gradient terms included, and not to whether the turbulence was assumed to adhere to isotropy or axisymmetry. The resolution dependency of different turbulence statistics was assessed with a procedure that does not rely on the dissipation scale η. The streamwise evolution of the TKE components and ε was found to collapse across grids when the rod diameter was included in the normalisation. We argue that this should be the case between all regular grids when the other relevant dimensionless quantities are matched and the flow has become homogeneous across the stream. Two-point space correlation functions at x/M = 1 show evidence of complex wake interactions which exhibit a strong Reynolds number dependence. However, these changes in initial conditions disappear, indicating rapid cross-stream homogenisation. On the other hand, isotropy was, as expected, not found to be established by x/M = 12 for any case studied.
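
For context, the sensitivity noted above concerns how many of the mean-squared gradient terms in the exact dissipation are retained; the full definition and its common one-gradient isotropic surrogate are (standard definitions, stated here for orientation):

```latex
% Exact TKE dissipation versus the isotropic surrogate that keeps only one
% measurable gradient term; s_ij is the fluctuating strain-rate tensor and
% angle brackets denote averaging.
\varepsilon = 2\nu\,\langle s_{ij} s_{ij} \rangle,
\qquad
\varepsilon_{\mathrm{iso}} = 15\,\nu\,\Bigl\langle \bigl(\tfrac{\partial u}{\partial x}\bigr)^{2} \Bigr\rangle
```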

Relevance:

60.00%

Publisher:

Abstract:

We consider a mobile sensor network monitoring a spatio-temporal field. Given limited cache sizes at the sensor nodes, the goal is to develop a distributed cache management algorithm to efficiently answer queries with a known probability distribution over the spatial dimension. First, we propose a novel distributed information theoretic approach in which the nodes locally update their caches based on full knowledge of the space-time distribution of the monitored phenomenon. At each time instant, local decisions are made at the mobile nodes concerning which samples to keep and whether or not a new sample should be acquired at the current location. These decisions are made so as to minimize an entropic utility function that captures the average amount of uncertainty in queries given the probability distribution of query locations. Second, we propose a different correlation-based technique, which only requires knowledge of the second-order statistics, thus relaxing the stringent constraint of having a priori knowledge of the query distribution, while significantly reducing the computational overhead. It is shown that the proposed approaches considerably improve the average field estimation error by maintaining efficient cache content. It is further shown that the correlation-based technique is robust to model mismatch in case of imperfect knowledge of the underlying generative correlation structure.
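
A highly simplified sketch of the entropic selection step, with hypothetical inputs and a greedy selection rule that stands in for the paper's richer utility and update machinery:

```python
import numpy as np

def select_cache(residual_entropy, query_probs, cache_size):
    """Greedy caricature of the entropic selection step (hypothetical inputs):
    residual_entropy[i, q] estimates the uncertainty left in query region q if
    sample i is kept, and query_probs weights regions by how often they are
    queried. Keep the cache_size samples with the lowest expected uncertainty."""
    expected_uncertainty = residual_entropy @ query_probs   # one score per sample
    return np.argsort(expected_uncertainty)[:cache_size]    # indices of samples to keep
```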

Relevance:

60.00%

Publisher:

Abstract:

The second-order statistics of neural activity were examined in a model of the cat LGN and V1 during free-viewing of natural images. In the model, the specific patterns of thalamocortical activity required for a Hebbian maturation of direction-selective cells in V1 were found during the periods of visual fixation, when small eye movements occurred, but not when natural images were examined in the absence of fixational eye movements. In addition, simulations of stroboscopic rearing that replicated the abnormal pattern of eye movements observed in kittens chronically exposed to stroboscopic illumination produced results consistent with the reported loss of direction selectivity and preservation of orientation selectivity. These results suggest the involvement of the oculomotor activity of visual fixation in the maturation of cortical direction selectivity.

Relevance:

60.00%

Publisher:

Abstract:

Investment funds provide a low-cost method of sharing in the rewards from capitalism. Recently, "alternative investments" such as hedge funds have grown rapidly, and the trading strategies open to hedge funds are now becoming available to mutual funds and even to ordinary retail investors. In this paper we analyze problems in assessing fund performance and the prospects for investment fund sectors. Choosing genuine outperformers among top funds requires a careful assessment of non-normality, order statistics and the possibility of false discoveries. The risk-adjusted performance of the average hedge fund over the last 10-15 years is actually not that impressive, although the "top" funds do appear to have statistically significant positive alphas.
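
One standard way of guarding against the false discoveries mentioned above, when many funds are screened at once, is the Benjamini-Hochberg procedure; the sketch below is an illustrative stand-in and not necessarily the method used in the paper:

```python
import numpy as np

def benjamini_hochberg(p_values, q=0.05):
    """Benjamini-Hochberg step-up procedure for controlling the false discovery
    rate across many simultaneous alpha tests. Illustration only; the paper may
    rely on a different false-discovery correction."""
    p = np.asarray(p_values, dtype=float)
    n = p.size
    order = np.argsort(p)
    passing = np.nonzero(p[order] <= q * (np.arange(n) + 1) / n)[0]
    k = passing.max() + 1 if passing.size else 0   # largest rank with p_(i) <= q*i/n
    rejected = np.zeros(n, dtype=bool)
    rejected[order[:k]] = True                     # funds flagged as genuine outperformers
    return rejected
```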

Relevance:

60.00%

Publisher:

Abstract:

The Grey Level Co-occurrence Matrix (GLCM), one of the best known tools for texture analysis, estimates image properties related to second-order statistics. These image properties, commonly known as Haralick texture features, can be used for image classification, image segmentation, and remote sensing applications. However, their computation is highly intensive, especially for very large images such as medical ones. Therefore, methods to accelerate their computation are highly desired. This paper proposes the use of programmable hardware to accelerate the calculation of the GLCM and Haralick texture features. Further, as an example of the speedup offered by programmable logic, a multispectral computer vision system for automatic diagnosis of prostatic cancer has been implemented. The performance is then compared against a microprocessor-based solution.
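
As a plain-software reference for the quantities being accelerated, a minimal GLCM and two Haralick features can be computed as below (assuming a small number of grey levels; the paper's hardware implementation obviously differs):

```python
import numpy as np

def glcm(image, dx=1, dy=0, levels=8):
    """Symmetric, normalised grey-level co-occurrence matrix for a single pixel
    offset (dx, dy >= 0). `image` holds integers in [0, levels)."""
    rows, cols = image.shape
    src = image[:rows - dy, :cols - dx]          # reference pixels
    dst = image[dy:, dx:]                        # neighbours at the chosen offset
    g = np.zeros((levels, levels), dtype=np.float64)
    np.add.at(g, (src, dst), 1.0)                # accumulate co-occurrence counts
    g += g.T                                     # enforce symmetry
    return g / g.sum()                           # normalise to joint probabilities

def haralick_features(g):
    """Two of the Haralick texture features derived from the GLCM."""
    i, j = np.indices(g.shape)
    contrast = np.sum(g * (i - j) ** 2)
    inverse_difference_moment = np.sum(g / (1.0 + (i - j) ** 2))
    return contrast, inverse_difference_moment
```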

Relevance:

60.00%

Publisher:

Abstract:

In this paper we concentrate on direct semi-blind spatial equalizer design for MIMO systems with Rayleigh fading channels. Our aim is to develop an algorithm which can outperform the classical training-based method with the same training information, and avoid the problems of slow convergence and local minima associated with purely blind methods. A general semi-blind cost function is first constructed which incorporates both the training information from the known data and some form of higher-order statistics (HOS) of the unknown sequence. Then, based on the developed cost function, we propose two semi-blind iterative and adaptive algorithms to find the desired spatial equalizer. To further improve the performance and convergence speed of the proposed adaptive method, we propose a technique to find the optimal choice of step size. Simulation results demonstrate the performance of the proposed algorithms and of comparable schemes.
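
One common way to build such a semi-blind cost, shown only to illustrate the structure (the paper's exact HOS term and weighting may differ), combines a least-squares term over the training set T with a constant-modulus term over the unknown symbols:

```latex
% Illustrative semi-blind cost: a training-based least-squares term plus a
% constant-modulus (higher-order statistics) term over the blind part, balanced
% by 0 <= lambda <= 1. R_2 is the constant-modulus dispersion constant.
J(\mathbf{w}) = (1-\lambda)\sum_{n \in T}\bigl|\mathbf{w}^{H}\mathbf{x}(n) - s(n)\bigr|^{2}
 + \lambda \sum_{n \notin T}\Bigl(\bigl|\mathbf{w}^{H}\mathbf{x}(n)\bigr|^{2} - R_{2}\Bigr)^{2}
```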

Relevance:

60.00%

Publisher:

Abstract:

Using seven strategically placed, time-synchronized body-worn receivers covering the head, upper front and back torso, and the limbs, we have investigated the effect of user state (stationary or mobile) and local environment (anechoic chamber, open office area and hallway) upon first- and second-order statistics for on-body fading channels. Three candidate models were considered: Nakagami, Rice and lognormal. Using maximum likelihood estimation and the Akaike information criterion, it was established that the Nakagami-m distribution best described small-scale fading for the majority of on-body channels over all the measurement scenarios. When the user was stationary, Nakagami-m parameters were found to be much greater than 1, irrespective of local surroundings. For mobile channels, Nakagami-m parameters decreased significantly, with channels in the open office area and hallway experiencing the worst fading conditions.
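
A minimal sketch of the model-selection step described above, assuming SciPy's generic maximum-likelihood fitting (the measured on-body data are, of course, not reproduced here):

```python
import numpy as np
from scipy import stats

def rank_fading_models(envelope):
    """Fit the three candidate small-scale fading models by maximum likelihood
    and rank them with the Akaike information criterion (lower is better)."""
    candidates = {"Nakagami": stats.nakagami, "Rice": stats.rice,
                  "Lognormal": stats.lognorm}
    aic = {}
    for name, dist in candidates.items():
        params = dist.fit(envelope, floc=0)                 # MLE, location pinned at 0
        log_lik = np.sum(dist.logpdf(envelope, *params))
        k = len(params) - 1                                 # free parameters (loc is fixed)
        aic[name] = 2 * k - 2 * log_lik
    return min(aic, key=aic.get), aic
```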

Relevance:

60.00%

Publisher:

Abstract:

Biosignal measurement and processing is increasingly being deployed in ambulatory situations, particularly in connected health applications. Such an environment dramatically increases the likelihood of artifacts, which can occlude features of interest and reduce the quality of information available in the signal. If multichannel recordings are available for a given signal source, then there is currently a considerable range of methods which can suppress or in some cases remove the distorting effect of such artifacts. There are, however, considerably fewer techniques available if only a single-channel measurement is available, and yet single-channel measurements are important where minimal instrumentation complexity is required. This paper describes a novel artifact removal technique for use in such a context. The technique, known as ensemble empirical mode decomposition with canonical correlation analysis (EEMD-CCA), is capable of operating on single-channel measurements. The EEMD technique is first used to decompose the single-channel signal into a multidimensional signal. The CCA technique is then employed to isolate the artifact components from the underlying signal using second-order statistics. The new technique is tested against the currently available wavelet denoising and EEMD-ICA techniques using both electroencephalography and functional near-infrared spectroscopy data and is shown to produce significantly improved results.
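
A rough sketch of the single-channel idea, under the assumption that EEMD and CCA are taken from the PyEMD and scikit-learn packages and that artifact components are selected naively; the paper's actual pipeline and selection criterion will differ:

```python
import numpy as np
from PyEMD import EEMD                          # pip install EMD-signal (assumed API)
from sklearn.cross_decomposition import CCA

def eemd_cca_clean(x, n_artifact=1):
    """Decompose a single channel with EEMD, run CCA between the IMFs and a
    one-sample-delayed copy (a second-order-statistics criterion), zero the
    least autocorrelated sources, and rebuild the signal (one sample shorter)."""
    imfs = EEMD().eemd(np.asarray(x, dtype=float))   # shape: (n_imfs, n_samples)
    Y = imfs[:, 1:].T                                # IMFs ...
    Yd = imfs[:, :-1].T                              # ... and their delayed copies
    cca = CCA(n_components=imfs.shape[0]).fit(Y, Yd)
    S = cca.transform(Y)                             # sources ordered by decreasing
                                                     # autocorrelation
    S[:, -n_artifact:] = 0.0                         # discard the least autocorrelated
    return np.sum(cca.inverse_transform(S).T, axis=0)  # back to IMF space, recombine
```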

Relevance:

60.00%

Publisher:

Abstract:

This paper contributes to and expands on the Nakagami-m phase model. It derives exact, closed-form expressions for both the phase cumulative distribution function and its inverse. In addition, empirical first- and second-order statistics obtained from measurements conducted in a body-area network scenario were used to fit the phase probability density function, the phase cumulative distribution function, and the phase crossing rate expressions. Remarkably, the unlikely shapes of the phase statistics, as predicted by the theoretical formulations, are actually encountered in practice.
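
For context, the phase probability density of the Nakagami-m phase model (Yacoub's formulation), from which the closed-form CDF and its inverse follow, is quoted here in its standard literature form:

```latex
% Nakagami-m phase probability density (Yacoub's phase model), defined on
% -pi < theta <= pi; the paper derives the corresponding CDF and its inverse.
f_{\Theta}(\theta) = \frac{\Gamma(m)\,\lvert \sin 2\theta \rvert^{\,m-1}}{2^{m}\,\Gamma^{2}(m/2)}
```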

Relevance:

60.00%

Publisher:

Abstract:

This article describes a finite element-based formulation for the statistical analysis of the response of stochastic structural composite systems whose material properties are described by random fields. A first-order technique is used to obtain the second-order statistics of the structural response, considering means and variances of the displacement and stress fields of plate or shell composite structures. The propagation of uncertainties depends on sensitivities, taken as measures of the variation effects. The adjoint variable method is used to obtain the sensitivity matrix. This method is appropriate for composite structures due to the large number of random input parameters. Dominant effects on the stochastic characteristics are studied by analyzing the influence of the different random parameters. In particular, a study of the influence of anisotropy on the propagation of uncertainties in angle-ply composites is carried out based on the proposed approach.
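
In compact form, a first-order technique of this kind propagates means and covariances through the sensitivity matrix obtained (here via the adjoint variable method); these are the standard first-order second-moment relations, not quoted from the article:

```latex
% First-order second-moment relations: theta are the random material inputs with
% mean bar-theta and covariance C_theta, u is the structural response, and S is
% the sensitivity matrix evaluated at the mean input.
\mathrm{E}[\mathbf{u}] \approx \mathbf{u}(\bar{\boldsymbol{\theta}}),
\qquad
\mathrm{Cov}[\mathbf{u}] \approx \mathbf{S}\,\mathbf{C}_{\theta}\,\mathbf{S}^{T},
\qquad
S_{ij} = \left.\frac{\partial u_i}{\partial \theta_j}\right|_{\boldsymbol{\theta}=\bar{\boldsymbol{\theta}}}
```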

Relevance:

60.00%

Publisher:

Abstract:

This study aims to test the suitability of SAR images, of medium and high resolution, for characterizing land-cover types in an urban environment. It is based on textural approaches using second-order statistics. More specifically, we look for the texture parameters that are most relevant for discriminating urban objects. To this end, Radarsat-1 images in fine mode with HH polarization and Radarsat-2 images in fine mode with dual and quad polarization and in ultra-fine mode with HH polarization were used. The land-cover classes of interest were dense built-up areas, medium-density built-up areas, low-density built-up areas, industrial and institutional buildings, low-density vegetation, dense vegetation, and water. The nine texture parameters analysed were grouped into families according to their mathematical definition. The similarity/dissimilarity parameters include Homogeneity, Contrast, Similarity and Dissimilarity. The disorder parameters are Entropy and the Angular Second Moment. Standard Deviation and Correlation are dispersion parameters, and the Mean forms a family of its own. The experiments show that certain combinations of texture parameters from different families give very good classification results, whereas other combinations of texture parameters with similar mathematical definitions produce poorer results. Moreover, although using several texture parameters improves the classifications, the performance levels off beyond three parameters. Despite the good performance of this approach based on the complementarity of texture parameters, systematic errors due to cardinal effects persist in the classifications. To address this problem, a radiometric compensation model based on the radar cross-section (RCS) was developed. A radar simulation based on a digital surface model of the area made it possible to extract the backscattering zones of the buildings and to analyse the corresponding backscatter. A rule for compensating cardinal effects, based solely on the responses of objects as a function of their orientation with respect to the plane of illumination of the radar beam, was devised. Applying this algorithm to RADARSAT-1 and RADARSAT-2 images in HH, HV, VH, and VV polarizations yielded considerable gains and eliminated most of the classification errors due to cardinal effects.