892 results for Numbers, Random


Relevance: 20.00%

Abstract:

OBJECTIVES: 1. To critically evaluate a variety of mathematical methods of calculating effective population size (Ne) by conducting comprehensive computer simulations and by analysis of empirical data collected from the Moreton Bay population of tiger prawns. 2. To lay the groundwork for the application of the technology in the NPF. 3. To produce software for the calculation of Ne, and to make it widely available.
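As a purely illustrative sketch of what software for calculating Ne computes, here is one widely used textbook estimator, the temporal (F-statistic) method of Nei & Tajima with Waples' sampling correction. This is a generic method for comparison, not necessarily the approach evaluated in the study, and all function and variable names are mine:

```python
def temporal_ne(p0, pt, t, s0, st):
    """Estimate effective population size Ne from allele-frequency change
    between two samples taken t generations apart (temporal method).

    p0, pt : allele frequencies at the two sampling times (one per allele)
    s0, st : sample sizes (individuals) at the two sampling times
    Uses the Fc statistic of Nei & Tajima (1981) with Waples' (1989)
    correction for sampling noise; a textbook sketch only.
    """
    # Standardized variance of allele-frequency change, averaged over alleles
    fc_terms = []
    for x, y in zip(p0, pt):
        mean = (x + y) / 2.0
        fc_terms.append((x - y) ** 2 / (mean - x * y))
    fc = sum(fc_terms) / len(fc_terms)
    # Subtract the expected contribution of sampling error, then invert
    denom = fc - 1.0 / (2 * s0) - 1.0 / (2 * st)
    return t / (2.0 * denom)
```

For example, a frequency shift from 0.5 to 0.6 over 10 generations with samples of 50 individuals at each time point gives an estimate of Ne = 250.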

Relevance: 20.00%

Abstract:

Most cellular solids are random materials, while practically all theoretical structure-property results are for periodic models. To be able to generate theoretical results for random models, the finite element method (FEM) was used to study the elastic properties of solids with a closed-cell cellular structure. We have computed the density (rho) and microstructure dependence of the Young's modulus (E) and Poisson's ratio (PR) for several different isotropic random models based on Voronoi tessellations and level-cut Gaussian random fields. The effect of partially open cells is also considered. The results, which are best described by a power law E ∝ rho^n (1 < n < 2), show the influence of randomness and isotropy on the properties of closed-cell cellular materials, and are found to be in good agreement with experimental data. (C) 2001 Acta Materialia Inc. Published by Elsevier Science Ltd. All rights reserved.
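A power-law exponent of the kind reported here can be recovered from (density, modulus) pairs by a least-squares fit in log-log space. The sketch below uses synthetic data generated with a known exponent for illustration, not the paper's FEM results:

```python
import math

def fit_power_law(rho, E):
    """Fit E = C * rho**n by ordinary least squares in log-log space
    and return the exponent n. A generic recipe for checking a
    1 < n < 2 scaling; not code from the study."""
    xs = [math.log(r) for r in rho]
    ys = [math.log(e) for e in E]
    m = len(xs)
    xbar = sum(xs) / m
    ybar = sum(ys) / m
    # Slope of the log-log regression line is the power-law exponent
    slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
            sum((x - xbar) ** 2 for x in xs)
    return slope

# Synthetic closed-cell data generated with exponent n = 1.5 (illustration only)
rho = [0.05, 0.1, 0.2, 0.4]
E = [r ** 1.5 for r in rho]
```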

Relevance: 20.00%

Abstract:

A mixture model incorporating long-term survivors has been adopted in the field of biostatistics, where some individuals may never experience the failure event under study. The surviving fractions may be considered as cured. In most applications, the survival times are assumed to be independent. However, when the survival data are obtained from a multi-centre clinical trial, it is conceived that the environmental conditions and facilities shared within a clinic affect the proportion cured as well as the failure risk for the uncured individuals. This necessitates a long-term survivor mixture model with random effects. In this paper, the long-term survivor mixture model is extended for the analysis of multivariate failure time data using the generalized linear mixed model (GLMM) approach. The proposed model is applied to analyse a numerical data set from a multi-centre clinical trial of carcinoma as an illustration. Some simulation experiments are performed to assess the applicability of the model based on the average biases of the estimates obtained. Copyright (C) 2001 John Wiley & Sons, Ltd.

Relevance: 20.00%

Abstract:

We have developed a highly sensitive cytolysis test, the fluorolysis assay, as a simple, nonradioactive and inexpensive alternative to the standard Cr-51-release assay. P815 cells were stably transfected with a plasmid expressing the enhanced green fluorescent protein (EGFP) gene. These target cells were coated with or without cognate peptide or anti-CD3 Ab and then incubated with CD8(+) T cells to allow antigen-specific or nonspecific lysis. The degree of target cell lysis was measured using flow cytometry to count the percentage of viable propidium iodide(-) EGFP(+) cells, whose numbers were standardized to a reference number of fluorochrome-linked beads. By using small numbers of target cells (200-800 per reaction) and extended incubation times (up to 2 days), the antigen-specific cytolytic activity of one to two activated CD8(+) T cells of a CTL line could be detected. The redirected fluorolysis assay also measured the activity of very few (≥6) primary CD8(+) T cells following polyclonal activation. Importantly, antigen-specific lysis by small numbers (≥25) of primary CD8(+) T cells could be directly measured ex vivo. This exquisite sensitivity of the fluorolysis assay, which was at least 8- to 33-fold higher than that of an optimized Cr-51-release assay, allows in vitro and ex vivo studies of immune responses that would otherwise not be possible due to low CTL numbers or frequencies. (C) 2002 Elsevier Science B.V. All rights reserved.

Relevance: 20.00%

Abstract:

A finite-element method is used to study the elastic properties of random three-dimensional porous materials with highly interconnected pores. We show that Young's modulus, E, is practically independent of Poisson's ratio of the solid phase, nu(s), over the entire solid fraction range, and Poisson's ratio, nu, becomes independent of nu(s) as the percolation threshold is approached. We represent this behaviour of nu in a flow diagram. This interesting but approximate behaviour is very similar to the exactly known behaviour in two-dimensional porous materials. In addition, the behaviour of nu versus nu(s) appears to imply that information in the dilute porosity limit can affect behaviour in the percolation threshold limit. We summarize the finite-element results in terms of simple structure-property relations, instead of tables of data, to make it easier to apply the computational results. Without using accurate numerical computations, one is limited to various effective medium theories and rigorous approximations like bounds and expansions. The accuracy of these equations is unknown for general porous media. To verify a particular theory it is important to check that it predicts both isotropic elastic moduli, i.e. prediction of Young's modulus alone is necessary but not sufficient. The subtleties of Poisson's ratio behaviour actually provide a very effective method for showing differences between the theories and demonstrating their ranges of validity. We find that for moderate- to high-porosity materials, none of the analytical theories is accurate and, at present, numerical techniques must be relied upon.

Relevance: 20.00%

Abstract:

Recently, several groups have investigated quantum analogues of random walk algorithms, both on a line and on a circle. It has been found that the quantum versions have markedly different features to the classical versions. Namely, the variance on the line, and the mixing time on the circle, increase quadratically faster in the quantum versions as compared to the classical versions. Here, we propose a scheme to implement the quantum random walk on a line and on a circle in an ion trap quantum computer. With current ion trap technology, the number of steps that could be experimentally implemented will be relatively small. However, we show how the enhanced features of these walks could be observed experimentally. In the limit of strong decoherence, the quantum random walk tends to the classical random walk. By measuring the degree to which the walk remains 'quantum', this algorithm could serve as an important benchmarking protocol for ion trap quantum computers.
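The quadratically enhanced spreading can be seen in a minimal sketch of the discrete-time Hadamard walk on the line. This is the standard textbook construction, not the ion-trap scheme proposed in the paper: after t steps the position variance of the quantum walk grows like t^2, whereas the unbiased classical walk has variance exactly t.

```python
import math

def hadamard_walk(steps):
    """Discrete-time Hadamard quantum walk on the integer line, started at
    the origin in the symmetric coin state (|L> + i|R>)/sqrt(2).
    Returns the position probability distribution as a dict."""
    s = 1 / math.sqrt(2)
    # amplitudes keyed by (position, coin state)
    amp = {(0, 'L'): s, (0, 'R'): 1j * s}
    for _ in range(steps):
        new = {}
        for (x, c), a in amp.items():
            # Hadamard coin: L -> (L+R)/sqrt2, R -> (L-R)/sqrt2,
            # then shift L-component left and R-component right
            if c == 'L':
                moves = [((x - 1, 'L'), s * a), ((x + 1, 'R'), s * a)]
            else:
                moves = [((x - 1, 'L'), s * a), ((x + 1, 'R'), -s * a)]
            for key, v in moves:
                new[key] = new.get(key, 0) + v
        amp = new
    prob = {}
    for (x, _), a in amp.items():
        prob[x] = prob.get(x, 0.0) + abs(a) ** 2
    return prob

def variance(prob):
    """Variance of a position distribution given as {position: probability}."""
    mean = sum(p * x for x, p in prob.items())
    return sum(p * (x - mean) ** 2 for x, p in prob.items())
```

Running `variance(hadamard_walk(30))` gives a value several times larger than 30, the variance of the corresponding 30-step classical walk.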

Relevance: 20.00%

Abstract:

Objective: To describe new measures of risk from case-control and cohort studies, which are simple to understand and relate to numbers of the population at risk. Design: Theoretical development of new measures of risk. Setting: Review of literature and previously described measures. Main results: The new measures are: (1) the population impact number (PIN), the number of those in the whole population among whom one case is attributable to the exposure or risk factor (this is equivalent to the reciprocal of the population attributable risk); (2) the case impact number (CIN), the number of people with the disease or outcome for whom one case will be attributable to the exposure or risk factor (this is equivalent to the reciprocal of the population attributable fraction); (3) the exposure impact number (EIN), the number of people with the exposure among whom one excess case is attributable to the exposure (this is equivalent to the reciprocal of the attributable risk); (4) the exposed cases impact number (ECIN), the number of exposed cases among whom one case is attributable to the exposure (this is equivalent to the reciprocal of the aetiological fraction). Each impact number reflects the number of people in the relevant population (the whole population, the cases, all those exposed, and the exposed cases) among whom one case is attributable to the particular risk factor. Conclusions: These new measures should help communicate the impact on a population of estimates of risk derived from cohort or case-control studies.
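Since each impact number is defined as the reciprocal of a standard epidemiological fraction, all four can be computed from a cohort-style summary: the risks in the exposed and unexposed groups plus the exposure prevalence. The sketch below follows the reciprocal definitions quoted in the abstract; the function and variable names are mine, not the authors':

```python
def impact_numbers(risk_exposed, risk_unexposed, p_exposed):
    """Impact numbers from risks in the exposed and unexposed groups
    and the prevalence of exposure in the population."""
    rd = risk_exposed - risk_unexposed                 # attributable risk (risk difference)
    r_pop = p_exposed * risk_exposed + (1 - p_exposed) * risk_unexposed
    par = r_pop - risk_unexposed                       # population attributable risk
    paf = par / r_pop                                  # population attributable fraction
    af = rd / risk_exposed                             # aetiological fraction in the exposed
    return {
        'EIN': 1 / rd,     # exposed people per one excess case
        'ECIN': 1 / af,    # exposed cases per one attributable case
        'PIN': 1 / par,    # people in the whole population per one attributable case
        'CIN': 1 / paf,    # cases per one attributable case
    }
```

For example, with a risk of 0.02 in the exposed, 0.01 in the unexposed, and 30% of the population exposed (hypothetical numbers), EIN = 100, ECIN = 2, PIN ≈ 333 and CIN ≈ 4.3.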

Relevance: 20.00%

Abstract:

Objective: To outline the major methodological issues appropriate to the use of the population impact number (PIN) and the disease impact number (DIN) in health policy decision making. Design: Review of literature and calculation of PIN and DIN statistics in different settings. Setting: Previously proposed extensions to the number needed to treat (NNT): the DIN and the PIN, which give a population perspective to this measure. Main results: The PIN and DIN allow us to compare the population impact of different interventions, either within the same disease or across different diseases or conditions. The primary studies used for relative risk estimates should have outcomes, time periods and comparison groups that are congruent and relevant to the local setting. These need to be combined with local data on disease rates and population size. Depending on the particular problem, the target may be disease incidence or prevalence, and the effects of interest may be either the incremental impact or the total impact of each intervention. For practical application, it will be important to use sensitivity analyses to determine plausible intervals for the impact numbers. Conclusions: Attention to these methodological issues will permit the DIN and PIN to be used to help health policy makers assign a population perspective to measures of risk.

Relevance: 20.00%

Abstract:

This paper presents a new approach to the LU decomposition method for the simulation of stationary and ergodic random fields. The approach overcomes the size limitations of LU and is suitable for simulations of any size. The proposed approach can facilitate fast updating of generated realizations with new data, when appropriate, without repeating the full simulation process. Based on a novel column partitioning of the L matrix, expressed in terms of successive conditional covariance matrices, the approach presented here demonstrates that LU simulation is equivalent to the successive solution of kriging residual estimates plus random terms. Consequently, it can be used for the LU decomposition of matrices of any size. The simulation approach is termed conditional simulation by successive residuals since, at each step, a small set (group) of random variables is simulated with an LU decomposition of a matrix of updated conditional covariances of residuals. The simulated group is then used to estimate residuals without the need to solve large systems of equations.
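The equivalence between full LU (Cholesky) simulation and group-by-group simulation of kriging means plus residuals can be illustrated directly. The sketch below is my own minimal reading of the idea, not the authors' code: each group is simulated as its conditional (kriging) mean given the values already simulated, plus a residual drawn with a Cholesky factor of the small conditional covariance matrix.

```python
import numpy as np

def successive_residual_sim(C, groups, w):
    """Simulate z ~ N(0, C) group by group. `groups` is an ordered list of
    index lists partitioning 0..n-1; `w` is the vector of standard normal
    draws, passed in so the result is reproducible and comparable to a
    full-Cholesky simulation with the same draws."""
    n = C.shape[0]
    z = np.zeros(n)
    done = []                        # indices simulated so far
    for g in groups:
        if done:
            Cdd = C[np.ix_(done, done)]
            Cgd = C[np.ix_(g, done)]
            # kriging (conditional-mean) estimate of the new group
            mean = Cgd @ np.linalg.solve(Cdd, z[done])
            # conditional covariance of the residuals for the new group
            Cc = C[np.ix_(g, g)] - Cgd @ np.linalg.solve(Cdd, Cgd.T)
        else:
            mean = np.zeros(len(g))
            Cc = C[np.ix_(g, g)]
        L = np.linalg.cholesky(Cc)   # only a small matrix is decomposed here
        z[g] = mean + L @ w[g]
        done.extend(g)
    return z
```

With the same normal draws, this reproduces the full-size Cholesky simulation exactly, block identity by block identity, while never factoring more than one group-sized matrix at a time.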

Relevance: 20.00%

Abstract:

Sensitivity of output of a linear operator to its input can be quantified in various ways. In Control Theory, the input is usually interpreted as disturbance and the output is to be minimized in some sense. In stochastic worst-case design settings, the disturbance is considered random with imprecisely known probability distribution. The prior set of probability measures can be chosen so as to quantify how far the disturbance deviates from the white-noise hypothesis of Linear Quadratic Gaussian control. Such deviation can be measured by the minimal Kullback-Leibler informational divergence from the Gaussian distributions with zero mean and scalar covariance matrices. The resulting anisotropy functional is defined for finite power random vectors. Originally, anisotropy was introduced for directionally generic random vectors as the relative entropy of the normalized vector with respect to the uniform distribution on the unit sphere. The associated a-anisotropic norm of a matrix is then its maximum root mean square or average energy gain with respect to finite power or directionally generic inputs whose anisotropy is bounded above by a ≥ 0. We give a systematic comparison of the anisotropy functionals and the associated norms. These are considered for unboundedly growing fragments of homogeneous Gaussian random fields on multidimensional integer lattice to yield mean anisotropy. Correspondingly, the anisotropic norms of finite matrices are extended to bounded linear translation invariant operators over such fields.
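For a Gaussian vector the minimal KL divergence from the family N(0, lambda*I) has a closed form: the optimal scalar level is lambda* = E|w|^2 / n, and the minimum is (n/2) ln(lambda*) minus half the log-determinant of the covariance. The sketch below implements this Gaussian special case only, as one concrete reading of the anisotropy functional described above; it is not the paper's general definition for arbitrary finite-power vectors.

```python
import math
import numpy as np

def anisotropy(mean, cov):
    """Anisotropy of w ~ N(mean, cov): the minimal Kullback-Leibler
    divergence KL(N(mean, cov) || N(0, lambda*I)) over lambda > 0,
    evaluated in closed form. Zero iff w is zero-mean white noise."""
    n = cov.shape[0]
    power = np.trace(cov) + float(mean @ mean)   # E|w|^2
    lam = power / n                              # minimizing scalar covariance level
    sign, logdet = np.linalg.slogdet(cov)
    # At the optimal lambda the trace and dimension terms cancel, leaving:
    return 0.5 * (n * math.log(lam) - logdet)
```

Zero-mean white noise (cov = I) gives anisotropy 0, and any unequal variances or nonzero mean give a strictly positive value, matching the intuition that anisotropy measures deviation from the white-noise hypothesis.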