948 results for Tobey Mean


Relevance:

20.00%

Publisher:

Abstract:

Evidence has accumulated that radiation induces a transmissible, persistent destabilization of the genome, which may result in effects arising in the progeny of irradiated but surviving cells. An enhanced death rate among the progeny of cells surviving irradiation persists for many generations in the form of a reduced plating efficiency. Such delayed reproductive death is correlated with an increased occurrence of micronuclei. Since it has been suggested that radiation-induced chromosomal instability might depend on the radiation quality, we investigated the effects of alpha particles of different LET by looking at the frequency of delayed micronuclei in Chinese hamster V79 cells after cytochalasin-induced block of cell division. A dose-dependent increase in the frequency of micronuclei was found in cells assayed 1 week postirradiation or later. Also, there was a persistent increase in the frequency of dicentrics in surviving irradiated cells. Moreover, we found an increased micronucleus frequency in all of the 30 clones isolated from individual cells which had been irradiated with doses equivalent to either one, two or three alpha-particle traversals per cell nucleus. We conclude that the target for genomic instability in Chinese hamster cells must be larger than the cell nucleus. (C) 1997 by Radiation Research Society

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we consider the problem of tracking similar objects. We show how a mean field approach can be used to deal with interacting targets and we compare it with Markov Chain Monte Carlo (MCMC). Two mean field implementations are presented. The first one is more general and uses particle filtering. We discuss some simplifications of the base algorithm that reduce the computation time. The second one is based on suitable Gaussian approximations of probability densities that lead to a set of self-consistent equations for the means and covariances. These equations give the Kalman solution if there is no interaction. Experiments have been performed on two kinds of sequences. The first kind is composed of a single long sequence of twenty roaming ants and was previously analysed using MCMC. In this case, our mean field algorithms obtain substantially better results. The second kind corresponds to selected sequences of a football match in which the interaction avoids tracker coalescence in situations where independent trackers fail.
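
As a rough illustration of the mean field idea summarised above (a sketch under assumptions, not the authors' implementation), the Python fragment below re-weights each target's particle cloud against the current mean estimates of the other targets and iterates toward an approximate fixed point. The function names, the Gaussian observation likelihood and the repulsive interaction term are all illustrative choices.

```python
import numpy as np

def interaction_penalty(x, other_means, radius=10.0):
    """Soft exclusion term: large when a particle sits close to another
    target's current mean estimate (illustrative interaction model)."""
    if not other_means:
        return 0.0
    d = np.linalg.norm(np.asarray(other_means) - x, axis=1)
    return float(np.sum(np.exp(-(d / radius) ** 2)))

def mean_field_reweight(particles, observations, obs_noise=2.0,
                        n_iters=3, beta=2.0):
    """One mean-field update for K interacting targets.

    particles:    list of (N, 2) arrays, one particle cloud per target
    observations: (K, 2) array, one noisy position measurement per target
    Each target's weights are computed against the *means* of the other
    targets, and the update is repeated until approximately self-consistent.
    """
    K = len(particles)
    weights = [np.full(len(p), 1.0 / len(p)) for p in particles]
    means = [p.mean(axis=0) for p in particles]
    for _ in range(n_iters):  # fixed-point (self-consistency) iteration
        for k in range(K):
            others = [means[j] for j in range(K) if j != k]
            lik = np.exp(-np.sum((particles[k] - observations[k]) ** 2, axis=1)
                         / (2.0 * obs_noise ** 2))
            inter = np.exp(-beta * np.array([interaction_penalty(x, others)
                                             for x in particles[k]]))
            w = lik * inter
            weights[k] = w / w.sum()
            means[k] = np.average(particles[k], axis=0, weights=weights[k])
    return weights, means

# toy usage: two nearby targets with Gaussian particle clouds
rng = np.random.default_rng(0)
truth = np.array([[0.0, 0.0], [6.0, 0.0]])
clouds = [t + rng.normal(scale=3.0, size=(500, 2)) for t in truth]
obs = truth + rng.normal(scale=2.0, size=truth.shape)
w, m = mean_field_reweight(clouds, obs)
print(np.round(m, 2))
```

Without the interaction factor the weights reduce to the usual independent likelihood weighting, mirroring the abstract's remark that the Gaussian variant collapses to the Kalman solution when there is no interaction.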

Relevance:

20.00%

Publisher:

Abstract:

In recent years, gradient vector flow (GVF) based algorithms have been successfully used to segment a variety of 2-D and 3-D imagery. However, due to the compromise of internal and external energy forces within the resulting partial differential equations, these methods may lead to biased segmentation results. In this paper, we propose MSGVF, a mean shift based GVF segmentation algorithm that can successfully locate the correct borders. MSGVF is developed so that when the contour reaches equilibrium, the various forces resulting from the different energy terms are balanced. In addition, the smoothness constraint of image pixels is kept so that over- or under-segmentation can be reduced. Experimental results on publicly accessible datasets of dermoscopic and optic disc images demonstrate that the proposed method effectively detects the borders of the objects of interest.
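
For context, the sketch below implements only the classical gradient vector flow (GVF) diffusion that the abstract builds on, in the Xu & Prince style; the mean shift coupling that defines MSGVF is not reproduced, and the parameter values are illustrative assumptions.

```python
import numpy as np
from scipy import ndimage

def gradient_vector_flow(edge_map, mu=0.2, dt=0.5, n_iters=200):
    """Diffuse the gradient of an edge map into a GVF field.

    Iterates u_t = mu * Laplacian(u) - (u - f_x) * |grad f|^2 (and the
    analogous update for v), which spreads edge forces into homogeneous
    regions so a deformable contour can be attracted from far away.
    """
    f = edge_map.astype(float)
    fy, fx = np.gradient(f)          # np.gradient returns d/drow, d/dcol
    mag2 = fx ** 2 + fy ** 2
    u, v = fx.copy(), fy.copy()
    for _ in range(n_iters):
        u += dt * (mu * ndimage.laplace(u) - (u - fx) * mag2)
        v += dt * (mu * ndimage.laplace(v) - (v - fy) * mag2)
    return u, v
```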

Relevance:

20.00%

Publisher:

Abstract:

Life science research aims to continuously improve the quality and standard of human life. One of the major challenges in this area is to maintain food safety and security. A number of image processing techniques have been used to investigate the quality of food products. In this paper, we propose a new algorithm to effectively segment connected grains so that each of them can be inspected in a later processing stage. One family of existing segmentation methods is based on the idea of watersheding, and it has shown promising results in practice. However, due to over-segmentation, this technique performs poorly in situations such as inhomogeneous backgrounds and connected targets. To address this problem, we present a combination of two classical techniques. In the first step, a mean shift filter is used to eliminate the inhomogeneous background, with entropy used as the convergence criterion. Secondly, a color gradient algorithm is used to detect the most significant edges, and a marker-controlled watershed transform is applied to segment the cluttered objects remaining from the previous processing stages. The proposed framework offers a good compromise among execution time, usability, efficiency and segmentation quality when analyzing ring die pellets. The experimental results demonstrate that the proposed approach is effective and robust.
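
A generic reconstruction of such a pipeline is sketched below (mean shift smoothing, a colour gradient, then a marker-controlled watershed). It is an assumption-laden illustration, not the paper's code: the spatial/colour radii, the Otsu foreground mask and the distance-transform marker strategy are all illustrative choices, and the entropy-based stopping rule mentioned in the abstract is not reproduced.

```python
import cv2
import numpy as np
from scipy import ndimage
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

def segment_grains(bgr_image):
    """Mean shift smoothing -> colour gradient -> marker-controlled watershed."""
    # 1. mean shift filtering flattens the inhomogeneous background
    smoothed = cv2.pyrMeanShiftFiltering(bgr_image, sp=15, sr=30)

    # 2. colour gradient: per-channel Sobel magnitude, max over channels
    grads = [np.hypot(cv2.Sobel(smoothed[:, :, c], cv2.CV_32F, 1, 0),
                      cv2.Sobel(smoothed[:, :, c], cv2.CV_32F, 0, 1))
             for c in range(3)]
    gradient = np.max(grads, axis=0)

    # 3. foreground mask (assumes grains brighter than background) and
    #    one marker per distance-transform peak
    gray = cv2.cvtColor(smoothed, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    dist = ndimage.distance_transform_edt(mask > 0)
    peaks = peak_local_max(dist, min_distance=10, labels=mask > 0)
    markers = np.zeros(dist.shape, dtype=int)
    markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)

    # 4. marker-controlled watershed on the colour gradient splits touching grains
    return watershed(gradient, markers=markers, mask=mask > 0)
```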

Relevance:

20.00%

Publisher:

Abstract:

Dependency on thermal generation and continued wind power growth in Europe, driven by renewable energy and greenhouse gas emission targets, have resulted in an interesting set of challenges for power systems. The variability of wind power affects dispatch and balancing by grid operators, power plant operations by generating companies, and wholesale market costs. This paper quantifies the effects of high wind power penetration on power systems with a dependency on gas generation, using a realistic unit commitment and economic dispatch model. The test system is analyzed under two scenarios, with and without wind, over one year. The key finding of this preliminary study is that, despite increased ramping requirements in the wind scenario, the unit cost of electricity due to sub-optimal operation of gas generators does not deviate substantially from the no-wind scenario.
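
The toy calculation below conveys the scenario comparison in miniature: a merit-order dispatch of a small gas fleet against demand with and without a wind profile, reporting the unit cost of electricity. It is a deliberately simplified stand-in for the study's full unit commitment and economic dispatch model; the demand, wind, capacity and cost figures are invented for illustration.

```python
import numpy as np

def dispatch_cost(demand, wind, capacities, marginal_costs):
    """Stack gas units by marginal cost to cover the residual load
    (demand minus wind); return total cost and cost per MWh served."""
    order = np.argsort(marginal_costs)
    total_cost, served = 0.0, 0.0
    for d, w in zip(demand, wind):
        residual = max(d - w, 0.0)
        for i in order:
            gen = min(residual, capacities[i])
            total_cost += gen * marginal_costs[i]
            residual -= gen
            if residual <= 0:
                break
        served += d
    return total_cost, total_cost / served

# hypothetical 4-hour horizon and three gas units
demand = [900, 1000, 1100, 950]        # MW
wind_profile = [300, 50, 400, 150]     # MW, variable wind output
caps = np.array([400, 400, 500])       # MW
mc = np.array([45.0, 60.0, 80.0])      # cost per MWh

for label, wind in [("no wind", [0, 0, 0, 0]), ("with wind", wind_profile)]:
    total, unit = dispatch_cost(demand, wind, caps, mc)
    print(f"{label}: total cost {total:,.0f}, unit cost {unit:.1f}/MWh")
```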

Relevance:

20.00%

Publisher:

Abstract:

What is meant by the term random? Do we understand how to identify which type of randomisation to use in our future research projects? We, as researchers, often explain randomisation to potential research participants as being a 50/50 chance of selection to either an intervention or control group, akin to drawing numbers out of a hat. Is this an accurate explanation? And are all methods of randomisation equal? This paper aims to guide the researcher through the different techniques used to randomise participants with examples of how they can be used in educational research.
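
By way of illustration (the abstract does not list the specific techniques the paper covers, so the two shown here are an assumption), the sketch below contrasts simple randomisation, which is exactly the 50/50 "numbers out of a hat" picture, with permuted-block randomisation, which keeps the two arms balanced in size.

```python
import random

def simple_randomisation(n_participants, seed=1):
    """Each participant gets an independent 50/50 draw; group sizes can
    end up unbalanced purely by chance."""
    rng = random.Random(seed)
    return [rng.choice(["intervention", "control"]) for _ in range(n_participants)]

def block_randomisation(n_participants, block_size=4, seed=1):
    """Permuted-block randomisation: within every block of block_size
    participants, exactly half go to each arm, so the arms stay balanced."""
    assert block_size % 2 == 0
    rng = random.Random(seed)
    allocation = []
    while len(allocation) < n_participants:
        block = ["intervention", "control"] * (block_size // 2)
        rng.shuffle(block)
        allocation.extend(block)
    return allocation[:n_participants]

print(simple_randomisation(10))
print(block_randomisation(10))
```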

Relevance:

20.00%

Publisher:

Abstract:

Estimates of the zenith wet delay resulting from the analysis of data from space techniques, such as GPS and VLBI, have strong potential in climate modeling and weather forecasting applications. To be useful to meteorology, these estimates have to be converted to precipitable water vapor, a process that requires knowledge of the weighted mean temperature of the atmosphere, which varies in both space and time. In recent years, several models have been proposed to predict this quantity. Using a database of mean temperature values obtained by ray-tracing radiosonde profiles from more than 100 stations covering the globe, and about 2.5 years' worth of data, we have analyzed several of these models. Based on data from the European region, we conclude that the models provide identical levels of precision but different levels of accuracy. Our results indicate that regionally optimized models do not provide superior performance compared to the global models.
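
A minimal sketch of the conversion step described above is given below, using the widely cited Bevis-style relation between zenith wet delay, the weighted mean temperature Tm and precipitable water vapour; the refractivity constants are the commonly used "best average" values and the surface-temperature regression for Tm is one example of the global models the study compares, so treat the exact numbers as assumptions rather than the paper's own.

```python
def pwv_from_zwd(zwd_m, tm_kelvin):
    """Convert zenith wet delay (metres) to precipitable water vapour (metres)
    via the dimensionless factor Pi = PW / ZWD, which depends on Tm."""
    k2_prime = 22.1        # K / hPa
    k3 = 3.739e5           # K^2 / hPa
    rho_w = 1000.0         # kg / m^3, density of liquid water
    r_v = 461.5            # J / (kg K), specific gas constant of water vapour
    # Pi is roughly 0.15 for typical atmospheric conditions
    pi_factor = 1.0e8 / (rho_w * r_v * (k2_prime + k3 / tm_kelvin))
    return pi_factor * zwd_m

def tm_from_surface_temperature(ts_kelvin):
    """Bevis et al. (1992) linear regression for the weighted mean temperature,
    an example of a global prediction model for Tm."""
    return 70.2 + 0.72 * ts_kelvin

# example: a 0.20 m zenith wet delay at a surface temperature of 288 K
tm = tm_from_surface_temperature(288.0)
print(round(pwv_from_zwd(0.20, tm) * 1000, 1), "mm of precipitable water")
```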

Relevance:

20.00%

Publisher:

Abstract:

The Portuguese National Statistical Institute intends to produce estimates of the mean price of housing transactions.