971 results for ORDER-STATISTICS


Relevance:

60.00%

Publisher:

Abstract:

BACKGROUND: Little is known about the population's exposure to radio frequency electromagnetic fields (RF-EMF) in industrialized countries. OBJECTIVES: To examine levels of exposure and the importance of different RF-EMF sources and settings in a sample of volunteers living in a Swiss city. METHODS: RF-EMF exposure of 166 volunteers from Basel, Switzerland, was measured with personal exposure meters (exposimeters). Participants carried an exposimeter for 1 week (two separate weeks in 32 participants) and completed an activity diary. Mean values were calculated using the robust regression on order statistics (ROS) method. RESULTS: Mean weekly exposure to all RF-EMF sources was 0.13 mW/m² (0.22 V/m) (range of individual means 0.014-0.881 mW/m²). Exposure was mainly due to mobile phone base stations (32.0%), mobile phone handsets (29.1%) and digital enhanced cordless telecommunications (DECT) phones (22.7%). Persons owning a DECT phone (total mean 0.15 mW/m²) or mobile phone (0.14 mW/m²) were exposed more than those not owning a DECT or mobile phone (0.10 mW/m²). Mean values were highest in trains (1.16 mW/m²), airports (0.74 mW/m²) and tramways or buses (0.36 mW/m²), and higher during daytime (0.16 mW/m²) than nighttime (0.08 mW/m²). The Spearman correlation coefficient between mean exposure in the first and second week was 0.61. CONCLUSIONS: Exposure to RF-EMF varied considerably between persons and locations but was fairly consistent within persons. Mobile phone handsets, mobile phone base stations and cordless phones were important sources of exposure in urban Switzerland.

Relevance:

60.00%

Publisher:

Abstract:

The aim of this paper is to evaluate the diagnostic contribution of various types of texture features to the discrimination of hepatic tissue in abdominal non-enhanced Computed Tomography (CT) images. Regions of Interest (ROIs) corresponding to the classes normal liver, cyst, hemangioma, and hepatocellular carcinoma were drawn by an experienced radiologist. For each ROI, five distinct sets of texture features were extracted using First Order Statistics (FOS), the Spatial Gray Level Dependence Matrix (SGLDM), the Gray Level Difference Method (GLDM), Laws' Texture Energy Measures (TEM), and Fractal Dimension Measurements (FDM). To evaluate the ability of the texture features to discriminate the various types of hepatic tissue, each set of texture features, or its reduced version after genetic-algorithm-based feature selection, was fed to a feed-forward Neural Network (NN) classifier. For each NN, the area under the Receiver Operating Characteristic (ROC) curve (Az) was calculated for all one-vs-all discriminations of hepatic tissue. Additionally, the total Az for the multi-class discrimination task was estimated. The results show that features derived from FOS perform better than the other texture features (total Az: 0.802+/-0.083) in the discrimination of hepatic tissue.
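A typical first-order-statistics feature set is computed directly from the ROI's grey-level values, without any spatial dependence. A minimal sketch (the paper does not list its exact FOS features, so this set of five is an assumption):

```python
import math

def fos_features(pixels):
    """First-order statistics of an ROI's grey levels: mean, variance,
    skewness, kurtosis, and histogram entropy (a common FOS choice;
    the paper's exact feature set may differ)."""
    n = len(pixels)
    mean = sum(p for p in pixels) / n
    var = sum((p - mean) ** 2 for p in pixels) / n
    sd = math.sqrt(var)
    skew = sum((p - mean) ** 3 for p in pixels) / (n * sd ** 3) if sd else 0.0
    kurt = sum((p - mean) ** 4 for p in pixels) / (n * var ** 2) if var else 0.0
    # Entropy from the empirical grey-level histogram.
    hist = {}
    for p in pixels:
        hist[p] = hist.get(p, 0) + 1
    entropy = -sum((c / n) * math.log2(c / n) for c in hist.values())
    return {"mean": mean, "variance": var, "skewness": skew,
            "kurtosis": kurt, "entropy": entropy}
```

Each ROI then yields one such feature vector, which is what gets fed (possibly after feature selection) to the NN classifier.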

Relevance:

60.00%

Publisher:

Abstract:

Safety assessment of historic masonry structures is an open problem. The material is heterogeneous and anisotropic, the previous state of stress is hard to know, and the boundary conditions are uncertain. In the early 1950s it was proven that limit analysis is applicable to this kind of structure, and it has been considered a suitable tool since then. In cases where no slip occurs, the standard limit analysis theorems constitute an excellent tool due to their simplicity and robustness. It is enough to find any equilibrium solution that satisfies the limit constraints of the material, in the certainty that its load will be equal to or less than the actual load at the onset of collapse; it is not necessary to know the actual state of stress. Furthermore, this load at the onset of collapse is unique (uniqueness theorem), and it can be obtained as the optimum of either of a pair of dual convex mathematical programs. However, if the mechanisms at the onset of collapse involve sliding, any solution must satisfy both the static and the kinematic constraints, as well as a special kind of disjunctive constraints linking the two, which can be formulated as complementarity constraints. In the latter case the existence of a single solution is not guaranteed, so other ways must be sought to treat the uncertainty associated with its multiplicity. In recent years, research has focused on finding an absolute minimum below which collapse is impossible. This method is easy to state from a mathematical point of view, but computationally intractable, due to the complementarity constraints 0 ≤ y ⊥ z ≥ 0, which are neither convex nor smooth. The resulting decision problem is NP-complete, and the corresponding global optimization problem is NP-hard.
Nevertheless, obtaining a solution (without guarantee of success) is an affordable problem. This thesis proposes to solve that problem through Sequential Linear Programming, taking advantage of the special characteristics of the complementarity constraints, which written in bilinear form read y·z = 0, y ≥ 0, z ≥ 0, and of the fact that the complementarity error (in bilinear form) is an exact penalty function. But when it comes to finding the worst solution, the equivalent global optimization problem is intractable (NP-hard). Furthermore, until a minimum or maximum principle is demonstrated, it is questionable whether the effort spent on approximating this minimum is justified. In Chapter 5, it is proposed to find the frequency distribution of the load factor over all possible onset-of-collapse solutions, on a simple example. For this purpose, a Monte Carlo sampling of solutions is performed, using an exact polytope computation method as a contrast. The ultimate goal is to determine to what extent the search for the global minimum is justified, and to propose an alternative, probability-based approach to safety assessment. The frequency distributions of the load factors obtained for the case study show that both the maximum and the minimum load factors are very infrequent, and the more so the more perfect and continuous the contact is. The results confirm the interest of developing new probabilistic methods.
In Chapter 6, such a method is proposed, based on obtaining multiple solutions from random starting points and qualifying the results through order statistics. The purpose is to determine, for each solution, the probability of the onset of collapse. The method is applied (following the expectation reduction proposed by Ordinal Optimization) to obtain a solution that lies within a given percentage of the worst ones. Finally, in Chapter 7, hybrid methods incorporating metaheuristics are proposed for cases in which the search for the global minimum is justified.
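The Ordinal Optimization qualification mentioned for Chapter 6 rests on a basic order-statistics identity: if each random restart is an independent draw from the population of onset-of-collapse solutions, the chance that the best of n draws falls inside the worst fraction g is 1 − (1 − g)^n. A minimal sketch (the function name is illustrative; whether restarts are truly i.i.d. is an assumption of the sketch):

```python
import math

def restarts_needed(g, p):
    """Number of independent random restarts n so that, with probability
    at least p, the lowest load factor found lies within the worst
    fraction g of all onset-of-collapse solutions:
        1 - (1 - g)**n >= p   =>   n >= log(1 - p) / log(1 - g)."""
    return math.ceil(math.log(1 - p) / math.log(1 - g))
```

For example, landing in the worst 5% of solutions with 95% confidence needs only 59 restarts, regardless of how many solutions exist, which is the expectation reduction that makes the approach tractable.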

Relevância:

60.00% 60.00%

Publicador:

Resumo:

Atmospheric propagation at frequencies within the THz domain is deeply affected by the composition and phenomena of the troposphere. This paper focuses on the estimation of first-order statistics of total attenuation under non-rainy conditions at 100 GHz. For this purpose, a one-year meteorological database from Madrid, including radiosoundings, SYNOP observations and a co-sited rain gauge, has been used to calculate attenuation due to atmospheric gases and clouds, as well as to introduce and evaluate a rain detection method. This method allows rain events to be filtered out, refining the statistics of total attenuation under the scenarios under study. The resulting statistics are expected to be close to those obtained by experimental techniques under similar conditions.
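Once the rain detection method has flagged the rainy samples, the first-order statistics reduce to exceedance probabilities over the filtered attenuation series. A minimal sketch of that final step (function and variable names are illustrative, not from the paper):

```python
def exceedance_ccdf(atten_db, rain_flag, thresholds):
    """Complementary CDF of total attenuation under non-rainy conditions:
    drop the samples flagged as rain, then compute, for each threshold,
    the fraction of the remaining samples exceeding it."""
    dry = [a for a, r in zip(atten_db, rain_flag) if not r]
    return {t: sum(a > t for a in dry) / len(dry) for t in thresholds}
```

The dictionary returned maps each attenuation threshold (dB) to its non-rainy exceedance probability, which is the curve one would compare against experimental beacon statistics.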

Relevance:

60.00%

Publisher:

Abstract:

The different theoretical models related to storm wave characterization focus on determining the significant wave height of the storm peak, the mean period and, usually assuming a triangular storm shape, the duration. In some cases, the main direction is also considered. Nevertheless, the definition of the whole storm history, including the variation of the main random variables during the storm cycle, is not taken into consideration. The representativeness of the proposed storm models, analysed in a recent study using an empirical time-dependent maximum energy flux function, shows that the behaviour of the different storm models is extremely dependent on the climatic characteristics of the project area. Moreover, there are no theoretical models able to adequately reproduce the storm history evolution of sea states characterized by important swell components. To overcome this shortcoming, several theoretical storm shapes are investigated, building on the three best theoretical storm models: the Equivalent Magnitude Storm (EMS), the Equivalent Number of Waves Storm (ENWS) and the Equivalent Duration Storm (EDS) models. To analyse the representativeness of the new storm shape, the aforementioned maximum energy flux formulation and a wave overtopping discharge function are used. With the empirical energy flux formulation, the correctness of the different approaches is assessed through the progressive loss of hydraulic stability of the main armour layer caused by real and theoretical storms. For the overtopping equation, the total volume of discharge is considered. In all cases, the results highlight the greater representativeness of the triangular EMS model for sea waves and of the trapezoidal (non-parallel sides) EMS model for waves with a higher degree of wave development.
Taking into account the increase in offshore and shallow-water wind turbines, maritime transport and deep vertical breakwaters, the maximum wave height of the whole storm history, and that corresponding to each sea state belonging to its cycle's evolution, is also considered. The procedure uses the information usually available for extreme wave characterization. Extrapolations of the maximum wave height of the selected storms have also been considered. The fourth-order statistics of the sea states belonging to the real and theoretical storms have been estimated to complete the statistical analysis of individual wave height.
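The Equivalent Magnitude Storm construction mentioned above can be sketched directly: the equivalent triangle keeps the real storm's peak Hs and its magnitude (area of the Hs history above a reference threshold), and the duration follows from the triangle's area. This is a minimal sketch under a rectangle-rule magnitude; the papers' exact definitions (threshold choice, integration rule) may differ.

```python
def ems_triangle_duration(hs_series, dt_hours, hs_threshold):
    """Equivalent Magnitude Storm (EMS): duration (hours) of the triangular
    storm that preserves the real storm's peak Hs and its magnitude M
    (area of Hs above the threshold).  The triangle's area above the
    threshold is D * (Hs_peak - Hs_threshold) / 2, so
        D = 2 * M / (Hs_peak - Hs_threshold)."""
    # Rectangle-rule magnitude of the real storm above the threshold.
    magnitude = sum(h - hs_threshold for h in hs_series if h > hs_threshold) * dt_hours
    hs_peak = max(hs_series)
    return 2.0 * magnitude / (hs_peak - hs_threshold)
```

With the duration fixed this way, the triangular (or trapezoidal) Hs history can then be fed sea state by sea state into the energy-flux or overtopping damage functions.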

Relevance:

60.00%

Publisher:

Abstract:

Several cases have been described in the literature where genetic polymorphism appears to be shared between a pair of species. Here we examine the distribution of times to random loss of shared polymorphism in the context of the neutral Wright–Fisher model. Order statistics are used to obtain the distribution of times to loss of a shared polymorphism based on Kimura's solution to the diffusion approximation of the Wright–Fisher model. In a single species, the expected absorption time for a neutral allele having an initial allele frequency of ½ is 2.77N generations. If two species initially share a polymorphism, that shared polymorphism is lost as soon as either of the two species undergoes fixation. The loss of a shared polymorphism thus occurs sooner than the loss of polymorphism in a single species and has an expected time of 1.7N generations. Molecular sequences of genes with shared polymorphism may be characterized by the count of the number of sites that segregate in both species for the same nucleotides (or amino acids). The distribution of the expected numbers of these shared polymorphic sites is also obtained. Shared polymorphism appears to be more likely at genetic loci that have an unusually large number of segregating alleles, and the neutral coalescent proves to be very useful in determining the probability of shared allelic lineages expected by chance. These results are related to examples of shared polymorphism in the literature.
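The order-statistics step here is simply that the shared polymorphism survives until the minimum of the two species' absorption times. A small Monte Carlo with the neutral Wright–Fisher model illustrates why the expected loss time for the pair is shorter than for a single species (the population size and replicate count below are illustrative, chosen only to keep the run fast):

```python
import random

def wf_absorption_time(n_pop, p0=0.5, rng=random):
    """Generations until a neutral allele starting at frequency p0 is
    fixed or lost in a Wright-Fisher population of n_pop individuals."""
    count, t = round(p0 * n_pop), 0
    while 0 < count < n_pop:
        p = count / n_pop
        count = sum(rng.random() < p for _ in range(n_pop))  # binomial draw
        t += 1
    return t

rng = random.Random(1)
n_pop, reps = 80, 120
singles, shared = [], []
for _ in range(reps):
    t1 = wf_absorption_time(n_pop, rng=rng)
    t2 = wf_absorption_time(n_pop, rng=rng)
    singles += [t1, t2]
    # The shared polymorphism is lost at the FIRST fixation event.
    shared.append(min(t1, t2))
mean_single = sum(singles) / len(singles)
mean_shared = sum(shared) / len(shared)
```

The sample means track the abstract's diffusion results (roughly 2.77N generations for one species versus 1.7N for the minimum of two), though a short simulation like this only shows the ordering, not the exact constants.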

Relevance:

60.00%

Publisher:

Abstract:

A method is given for determining the time course and spatial extent of consistently and transiently task-related activations from other physiological and artifactual components that contribute to functional MRI (fMRI) recordings. Independent component analysis (ICA) was used to analyze two fMRI data sets from a subject performing 6-min trials composed of alternating 40-sec Stroop color-naming and control task blocks. Each component consisted of a fixed three-dimensional spatial distribution of brain voxel values (a “map”) and an associated time course of activation. For each trial, the algorithm detected, without a priori knowledge of their spatial or temporal structure, one consistently task-related component activated during each Stroop task block, plus several transiently task-related components activated at the onset of one or two of the Stroop task blocks only. Activation patterns occurring during only part of the fMRI trial are not observed with other techniques, because their time courses cannot easily be known in advance. Other ICA components were related to physiological pulsations, head movements, or machine noise. By using higher-order statistics to specify stricter criteria for spatial independence between component maps, ICA produced improved estimates of the temporal and spatial extent of task-related activation in our data compared with principal component analysis (PCA). ICA appears to be a promising tool for exploratory analysis of fMRI data, particularly when the time courses of activation are not known in advance.
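The role of higher-order statistics in ICA's stricter independence criterion can be seen in a toy example: two signals can be exactly uncorrelated (zero covariance, all PCA can see) while a fourth-order cross-statistic exposes their dependence. This sketch is an illustration of the principle, not of the ICA algorithm used in the paper:

```python
import random

rng = random.Random(7)
x = [rng.gauss(0.0, 1.0) for _ in range(20000)]
y = [v * v for v in x]  # deterministic function of x, yet uncorrelated with it

def mean(a):
    return sum(a) / len(a)

# Second-order check: the covariance E[xy] - E[x]E[y] is (near) zero,
# because E[x^3] = 0 for a symmetric distribution.
cov_xy = mean([a * b for a, b in zip(x, y)]) - mean(x) * mean(y)

# Higher-order check: E[x^2 y] - E[x^2] E[y] equals Var(x^2) here,
# which is clearly nonzero -> x and y are statistically dependent.
hos_xy = (mean([a * a * b for a, b in zip(x, y)])
          - mean([a * a for a in x]) * mean(y))
```

Decorrelation alone (PCA) would treat x and y as separate components; requiring the higher-order cross-statistics to vanish as well is what lets ICA separate genuinely independent sources.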

Relevance:

60.00%

Publisher:

Abstract:

This work deals with the random free vibration of functionally graded laminates with general boundary conditions and subjected to a temperature change, taking into account the randomness in a number of independent input variables such as Young's modulus, Poisson's ratio and the thermal expansion coefficient of each constituent material. Based on third-order shear deformation theory, the mixed-type formulation and a semi-analytical approach are employed to derive the standard eigenvalue problem in terms of deflection, mid-plane rotations and stress function. A mean-centered first-order perturbation technique is adopted to obtain the second-order statistics of vibration frequencies. A detailed parametric study is conducted, and extensive numerical results are presented in both tabular and graphical forms for laminated plates that contain functionally graded material made of aluminum and zirconia, showing the effects of scattering in thermo-elastic material constants, temperature change, edge support condition, side-to-thickness ratio, and plate aspect ratio on the stochastic characteristics of the natural frequencies.
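The mean-centered first-order perturbation idea is generic: expand the frequency about the mean inputs, so its variance is the sum of squared sensitivities times the input variances. A minimal sketch with finite-difference sensitivities on a stand-in frequency function (a 1-DOF oscillator, not the paper's FGM plate eigenproblem; independence of the inputs is assumed):

```python
import math

def perturb_stats(f, means, variances, h=1e-6):
    """Mean-centered first-order perturbation: evaluate f at the mean
    inputs, estimate sensitivities by finite differences, and propagate
    the input variances (inputs assumed independent):
        Var[f] ~= sum_i (df/db_i)^2 * Var[b_i]."""
    f0 = f(means)
    var = 0.0
    for i, m in enumerate(means):
        step = h * max(abs(m), 1.0)
        bumped = list(means)
        bumped[i] = m + step
        dfdb = (f(bumped) - f0) / step
        var += dfdb ** 2 * variances[i]
    return f0, var

# Illustrative "frequency": omega = sqrt(k/m) with random stiffness and mass.
omega0, omega_var = perturb_stats(
    lambda b: math.sqrt(b[0] / b[1]), means=[400.0, 1.0], variances=[16.0, 0.0001])
```

In the paper the same expansion is applied to the eigenvalues of the semi-analytical plate problem, with the sensitivities taken with respect to each constituent's thermo-elastic constants.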

Relevance:

60.00%

Publisher:

Abstract:

A solar power satellite has attracted attention as a clean, inexhaustible, large-scale base-load power supply. The following beam-control technology is used: a pilot signal is sent from the power receiving site and, after direction-of-arrival estimation, the beam is directed back to the Earth along the same direction. A novel direction-finding algorithm based on a linear prediction technique exploiting cyclostationary statistical information (spatial and temporal) is explored. Many modulated communication signals exhibit a cyclostationarity (or periodic correlation) property, corresponding to the underlying periodicity arising from carrier frequencies or baud rates. The problem was solved by using both cyclic second-order and cyclic higher-order statistics. By evaluating the corresponding cyclic statistics of the received data at certain cycle frequencies, we can extract the cyclic correlations of only those signals with the same cycle frequency, and null out the cyclic correlations of stationary additive noise and of all other co-channel interferences with different cycle frequencies. Thus, the signal detection capability can be significantly improved. The proposed algorithms employ cyclic higher-order statistics of the array output and suppress additive Gaussian noise of unknown spectral content, even when the noise shares common cycle frequencies with the non-Gaussian signals of interest. The proposed method fully exploits temporal information (multiple lags) and can correctly estimate the direction of arrival of desired signals by suppressing undesired signals. Our approach is generalized to direction-of-arrival estimation of cyclostationary coherent signals. In this paper, we propose a new approach to exploiting cyclostationarity that appears more advanced than existing direction-finding algorithms.
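The cyclic-statistics selectivity described above can be illustrated with the cyclic autocorrelation, R_x^α(τ) = (1/T) Σ_t x[t] x[t+τ] e^{−j2παt}: a real carrier at f_c produces a spectral line at cycle frequency α = 2 f_c, while at unrelated cycle frequencies the statistic averages toward zero. A minimal single-sensor sketch (the paper works with array data and higher-order cyclic statistics; the carrier and noise values here are illustrative):

```python
import cmath
import math
import random

def cyclic_autocorr(x, alpha, tau=0):
    """Cyclic autocorrelation R_x^alpha(tau) of a real discrete signal:
    (1/T) * sum_t x[t] * x[t+tau] * exp(-j*2*pi*alpha*t)."""
    T = len(x) - tau
    return sum(x[t] * x[t + tau] * cmath.exp(-2j * math.pi * alpha * t)
               for t in range(T)) / T

rng = random.Random(3)
fc = 0.1  # carrier frequency in cycles/sample
signal = [math.cos(2 * math.pi * fc * t) + 0.3 * rng.gauss(0.0, 1.0)
          for t in range(4000)]
peak = abs(cyclic_autocorr(signal, 2 * fc))   # at the carrier's cycle frequency
off = abs(cyclic_autocorr(signal, 0.177))     # at an unrelated cycle frequency
```

Only the signal sharing the chosen cycle frequency survives the averaging; stationary noise and interferers with other cycle frequencies are nulled out, which is the selection mechanism the abstract exploits.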

Relevance:

60.00%

Publisher:

Abstract:

2010 Mathematics Subject Classification: 62G30, 62E10.

Relevance:

60.00%

Publisher:

Abstract:

The great interest in nonlinear system identification is mainly due to the fact that many real systems are complex and need their nonlinearities taken into account so that their models can be successfully used in applications such as control, prediction and inference. This work evaluates the application of Fuzzy Wavelet Neural Networks (FWNN) to the identification of nonlinear dynamical systems subject to noise and outliers. These elements generally have negative effects on the identification procedure, resulting in erroneous interpretations of the dynamical behavior of the system. The FWNN combines in a single structure the ability of fuzzy logic to deal with uncertainties, the multiresolution characteristics of wavelet theory, and the learning and generalization abilities of artificial neural networks. Usually, the learning procedure of these neural networks is realized by a gradient-based method that uses the mean squared error as its cost function. This work proposes replacing this traditional function with an Information Theoretic Learning similarity measure called correntropy. With this similarity measure, higher-order statistics can be considered during the FWNN training process. For this reason, the measure is better suited to non-Gaussian error distributions and makes training less sensitive to the presence of outliers. To evaluate this replacement, FWNN models are obtained in two identification case studies: a real nonlinear system, consisting of a multisection tank, and a simulated system based on a model of the human knee joint. The results demonstrate that using correntropy as the cost function of the error backpropagation algorithm makes the identification procedure using FWNN models more robust to outliers. However, this is only achieved if the Gaussian kernel width of the correntropy is properly adjusted.
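The robustness claim can be seen directly in the cost functions themselves: the MSE grows quadratically with a single outlier, while correntropy, a mean of Gaussian kernel values over the errors, barely moves, at the price of the kernel width σ becoming a tuning parameter (the adjustment the abstract warns about). A minimal sketch with illustrative error values:

```python
import math

def mse(errors):
    """Mean squared error: quadratic, so one large error dominates."""
    return sum(e * e for e in errors) / len(errors)

def correntropy(errors, sigma=1.0):
    """Correntropy of the error signal: the mean Gaussian kernel value
    V(e) = (1/N) * sum_i exp(-e_i**2 / (2*sigma**2)).
    Maximizing V is the Information Theoretic Learning alternative
    to minimizing the MSE; sigma sets how far an error still 'counts'."""
    return (sum(math.exp(-e * e / (2.0 * sigma * sigma)) for e in errors)
            / len(errors))

clean = [0.1, -0.2, 0.05, 0.0, -0.1]
dirty = clean + [50.0]  # the same residuals plus one gross outlier
```

The outlier's kernel value is essentially zero, so it is effectively ignored by the correntropy criterion instead of dominating the gradient as it does under the MSE.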

Relevance:

40.00%

Publisher:

Abstract:

We study the statistical properties of orientation and rotation dynamics of elliptical tracer particles in two-dimensional, homogeneous, and isotropic turbulence by direct numerical simulations. We consider both the cases in which the turbulent flow is generated by forcing at large and intermediate length scales. We show that the two cases are qualitatively different. For large-scale forcing, the spatial distribution of particle orientations forms large-scale structures, which are absent for intermediate-scale forcing. The alignment with the local directions of the flow is much weaker in the latter case than in the former. For intermediate-scale forcing, the statistics of rotation rates depends weakly on the Reynolds number and on the aspect ratio of particles. In contrast with what is observed in three-dimensional turbulence, in two dimensions the mean-square rotation rate increases as the aspect ratio increases.

Relevance:

40.00%

Publisher:

Abstract:

An expression for the probability density function of the second-order response of a general FPSO in spreading seas is derived by using the Kac-Siegert approach. Various approximations of the second-order force transfer functions are investigated for a ship-shaped FPSO. It is found that, when expressed in non-dimensional form, the probability density function of the response is not particularly sensitive to wave spreading, although the mean squared response and the resulting dimensional extreme values can be. The analysis is then applied to a Sevan FPSO, which is a large cylindrical buoy-like structure. The second-order force transfer functions are derived by using an efficient semi-analytical hydrodynamic approach, and these are then employed to yield the extreme response. However, a significant effect of wave spreading on the statistics of the Sevan FPSO is found even in non-dimensional form. This implies that the exact statistics of a general ship-shaped FPSO may be sensitive to the wave direction, which needs to be verified in future work. It is also pointed out that Newman's approximation regarding the frequency dependence of the force transfer function is acceptable even for spreading seas. An improvement in the results may be attained by considering the angular dependence exactly.