897 results for Unbiased estimating functions


Relevance: 20.00%

Abstract:

We present a novel maximum-likelihood-based algorithm for estimating the distribution of alignment scores from the scores of unrelated sequences in a database search. Using a new method for measuring the accuracy of p-values, we show that our maximum-likelihood-based algorithm is more accurate than existing regression-based and lookup table methods. We explore a more sophisticated way of modeling and estimating the score distributions (using a two-component mixture model and expectation maximization), but conclude that this does not improve significantly over simply ignoring scores with small E-values during estimation. Finally, we measure the classification accuracy of p-values estimated in different ways and observe that inaccurate p-values can, somewhat paradoxically, lead to higher classification accuracy. We explain this paradox and argue that statistical accuracy, not classification accuracy, should be the primary criterion in comparisons of similarity search methods that return p-values that adjust for target sequence length.

Relevance: 20.00%

Abstract:

Distance sampling using line transects has not been previously used or tested for estimating koala abundance. In July 2001, a pilot survey was conducted to compare the use of line transects with strip transects for estimating koala abundance. Both methods provided a similar estimate of density. On the basis of the results of the pilot survey, the distribution and abundance of koalas in the Pine Rivers Shire, south-east Queensland, was determined using line-transect sampling. In total, 134 lines (length 64 km) were used to sample bushland areas. Eighty-two independent koalas were sighted. Analysis of the frequency distribution of sighting distances using the software program DISTANCE enabled a global detection function to be estimated for survey sites in bushland areas across the Shire. Abundance in urban parts of the Shire was estimated from densities obtained from total counts at eight urban sites that ranged from 26 to 51 ha in size. Koala abundance in the Pine Rivers Shire was estimated at 4584 (95% confidence interval, 4040-5247). Line-transect sampling is a useful method for estimating koala abundance provided experienced koala observers are used when conducting surveys.
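The density calculation behind a line-transect estimate can be sketched as follows, assuming a half-normal detection function (the DISTANCE software fits and compares several candidate models, so this is a simplification, and the function name is hypothetical):

```python
import math

def halfnormal_density_estimate(distances, total_line_length):
    """Line-transect density estimate with a half-normal detection
    function g(x) = exp(-x^2 / (2 sigma^2)).  sigma^2 is the MLE from
    the perpendicular sighting distances; the effective strip width
    (ESW) is the integral of g over x >= 0, and D = n / (2 * L * ESW).
    Distances and line length must share one unit; the density is per
    square unit of that length."""
    n = len(distances)
    sigma2 = sum(x * x for x in distances) / n       # MLE of sigma^2
    esw = math.sqrt(sigma2 * math.pi / 2.0)          # integral of g(x)
    density = n / (2.0 * total_line_length * esw)
    return density, esw
```

Multiplying the returned density by the surveyed bushland area then gives an abundance estimate of the kind reported above.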

Relevance: 20.00%

Abstract:

Cropp and Gabric [Ecosystem adaptation: do ecosystems maximise resilience? Ecology, in press] used a simple phytoplankton-zooplankton-nutrient model and a genetic algorithm to determine the parameter values that would maximize the value of certain goal functions: maximize biomass, maximize flux, maximize the flux-to-biomass ratio, and maximize resilience. They found that maximizing the other goal functions also maximized resilience. The objective of this study was to investigate whether their result is indicative of a general ecosystem principle, or peculiar to the model and parameter ranges used. This study successfully replicated the Cropp and Gabric experiment for a number of different model types; however, a different interpretation of the results is made. A new metric, concordance, was devised to describe the agreement between goal functions. Resilience was found to have the highest concordance of all goal functions trialled for most model types, implying that resilience offers a compromise between the established ecological goal functions. The parameter value range used is found to affect the parameter-versus-goal-function relationships: local maxima and minima affected the relationships between parameters and goal functions, and between goal functions.
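The abstract does not define how resilience was measured; a common choice in this literature is the asymptotic return rate to equilibrium, i.e. the negative of the largest real part among the eigenvalues of the community (Jacobian) matrix. A minimal sketch under that assumption (the function name is illustrative):

```python
import numpy as np

def resilience(jacobian):
    """Resilience as the asymptotic return rate to equilibrium: the
    negative of the largest real part among the eigenvalues of the
    community (Jacobian) matrix evaluated at the equilibrium.
    Positive resilience implies local stability; larger values mean
    faster recovery from small perturbations."""
    eigs = np.linalg.eigvals(np.asarray(jacobian, dtype=float))
    return -float(max(eigs.real))
```

A genetic algorithm maximizing a goal function would call such a measure once per candidate parameter set, after locating the model's equilibrium.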

Relevance: 20.00%

Abstract:

This study aimed to develop a practical method of estimating energy expenditure (EE) during tennis. Twenty-four elite female tennis players first completed a tennis-specific graded test in which five different intensity levels were applied randomly. Each intensity level was intended to simulate a game of singles tennis and comprised six 14-s periods of activity alternated with 20 s of active rest. Oxygen consumption (VO2) and heart rate (HR) were measured continuously, and each player's rating of perceived exertion (RPE) was recorded at the end of each intensity level. The rate of energy expenditure (EE(VO2)) during the test was calculated as the sum of VO2 during play and the 'O2 debt' during recovery, divided by the duration of the activity. There were significant individual linear relationships between EE(VO2) and RPE and between EE(VO2) and HR (r ≥ 0.89 and r ≥ 0.93, respectively; p …)
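The EE(VO2) calculation described above can be sketched in a few lines. The conversion to energy units via a caloric equivalent of 20.9 kJ per litre of O2 (about 5 kcal/L) is an assumption added here for illustration, and the function name is hypothetical:

```python
def energy_expenditure_rate(vo2_play_litres, o2_debt_litres, duration_min,
                            kj_per_litre=20.9):
    """Rate of energy expenditure (kJ/min): total O2 consumed during
    play plus the 'O2 debt' repaid during recovery, divided by the
    duration of the activity.  kj_per_litre is an assumed caloric
    equivalent of oxygen; the study itself reports EE in VO2 terms."""
    total_o2 = vo2_play_litres + o2_debt_litres
    return total_o2 * kj_per_litre / duration_min
```

The individual EE(VO2)-versus-RPE and EE(VO2)-versus-HR regressions reported above would then let either RPE or HR stand in for this measurement in the field.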

Relevance: 20.00%

Abstract:

Deterioration of concrete or reinforcing steel through excessive contaminant concentration is often the result of repeated wetting and drying cycles. At each cycle, the absorption of water carries new contaminants into the unsaturated concrete. Nuclear Magnetic Resonance (NMR) is used with large concrete samples to observe the shape of the wetting profile during a simple one-dimensional wetting process. The absorption of water by dry concrete is modelled by a nonlinear diffusion equation with the unsaturated hydraulic diffusivity being a strongly nonlinear function of the moisture content. Exponential and power functions are used for the hydraulic diffusivity and corresponding solutions of the diffusion equation adequately predict the shape of the experimental wetting profile. The shape parameters, describing the wetting profile, vary little between different blends and are relatively insensitive to subsequent re-wetting experiments allowing universal parameters to be suggested for these concretes.
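The absorption model just described can be written compactly. A sketch of the governing equation, with θ the volumetric moisture content and D(θ) the unsaturated hydraulic diffusivity in the two forms used:

```latex
\frac{\partial \theta}{\partial t}
  = \frac{\partial}{\partial x}\left( D(\theta)\,\frac{\partial \theta}{\partial x} \right),
\qquad
D(\theta) = D_0\, e^{\beta \theta}
\quad \text{or} \quad
D(\theta) = D_0\, \theta^{\,n} .
```

Because D depends on θ alone, the Boltzmann substitution φ = x t^{-1/2} collapses the profiles at different times onto a single curve, so the wetting front advances as the square root of time; this shape preservation is what makes a small set of universal profile parameters plausible for these concretes.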

Relevance: 20.00%

Abstract:

In this paper we present a technique for visualising hierarchical, symmetric, multimodal fitness functions that have been investigated in the evolutionary computation literature. The focus of this technique is on landscapes in moderate-dimensional, binary spaces (i.e., fitness functions defined over {0, 1}^n, for n ≤ 16). The visualisation approach involves an unfolding of the hyperspace into a two-dimensional graph, whose layout represents the topology of the space using a recursive relationship, and whose shading defines the shape of the cost surface defined on the space. Using this technique we present case-study explorations of three fitness functions: royal road, hierarchical-if-and-only-if (H-IFF), and hierarchically decomposable functions (HDF). The visualisation approach provides an insight into the properties of these functions, particularly with respect to the size and shape of the basins of attraction around each of the local optima.
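Of the three case-study functions, H-IFF has a compact recursive definition (Watson, Hornby and Pollack): a block scores its length when all of its bits agree, and credit is awarded at every level of a balanced binary split. A direct transcription for bit strings whose length is a power of two:

```python
def hiff(bits):
    """Hierarchical-if-and-only-if (H-IFF) fitness.  len(bits) must be
    a power of two: a uniform block contributes its length, and the two
    halves are always scored recursively as well."""
    n = len(bits)
    if n == 1:
        return 1
    score = hiff(bits[:n // 2]) + hiff(bits[n // 2:])
    if all(b == bits[0] for b in bits):   # block is all-0s or all-1s
        score += n
    return score
```

Its two global optima are the all-zeros and all-ones strings; the many partially-aligned local optima between them are exactly the basin structure the visualisation technique is designed to expose.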

Relevance: 20.00%

Abstract:

Within the development of motor vehicles, crash safety (e.g. occupant protection, pedestrian protection, low-speed damageability) is one of the most important attributes. To fulfil increasing requirements under shorter cycle times and rising pressure to reduce costs, car manufacturers keep intensifying their use of virtual development tools such as those in the domain of Computer Aided Engineering (CAE). For crash simulations, the explicit finite element method (FEM) is applied. The accuracy of the simulation process is highly dependent on the accuracy of the simulation model, including the midplane mesh. One of the roughest approximations typically made concerns the actual part thickness, which in reality can vary locally; for reasons of complexity, however, a constant thickness value is almost always defined throughout the entire part. On the other hand, correct consideration of thickness is one key enabler of precise fracture analysis within FEM. Thus, the availability of per-element thickness information, which does not exist explicitly in the FEM model, can significantly contribute to improved crash simulation quality, especially regarding fracture prediction. Even though the thickness is not explicitly available from the FEM model, it can be inferred from the original CAD geometric model through geometric calculations. This paper proposes and compares two thickness estimation algorithms, based on ray tracing and on nearest-neighbour 3D range searches. A systematic quantitative analysis of the accuracy of both algorithms is presented, as well as a thorough identification of the particular geometric arrangements under which their accuracy can be compared. These results enable the identification of each technique's weaknesses and hint towards a new, integrated approach to the problem that linearly combines the estimates produced by each algorithm.
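The nearest-neighbour half of the comparison can be sketched as follows. A brute-force search stands in for the spatial range-search structure an implementation would use, and the point-cloud representation of the CAD surfaces and all names are illustrative:

```python
import math

def nearest(point, cloud):
    """Brute-force nearest-neighbour distance (a k-d tree or similar
    structure would replace this linear scan in practice)."""
    return min(math.dist(point, q) for q in cloud)

def nn_thickness(mid_point, upper_cloud, lower_cloud):
    """Nearest-neighbour thickness estimate at a midplane point: the
    distance to the closest sample on each bounding surface, summed.
    Accurate for densely sampled, locally parallel surfaces; it
    degrades near edges and in strongly curved regions, which is where
    a ray-traced estimate along the local normal can take over."""
    return nearest(mid_point, upper_cloud) + nearest(mid_point, lower_cloud)
```

A linear combination of this estimate with a ray-traced one, weighted by the local geometric configuration, is the integrated approach the results hint towards.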

Relevance: 20.00%

Abstract:

Education for health is a process in which all public health and medical care personnel are involved. People learn both formally (planned learning experiences) and informally (unplanned learning experiences). Since the patient, the client, the consumer and the community expect public health and medical care personnel to assist them with health and disease issues and problems, the response of the professional "educates" the consumer whether the professional intends to educate or not. Therefore, it is incumbent on all public health and medical care professionals to understand their educational functions and their role in health education. It is also important that the role of the specialist in education be clear. The specialist, as with all other specialists, has an in-depth knowledge of his or her area of expertise, i.e., the teaching/learning process; s/he may function as a consultant to others to enhance the educational potential of their role, or s/he may work with a team or with communities or groups of patients. Specific competencies and knowledge are required of the health education specialist, and there is a body of learning and social change theory which provides a frame of reference for planning, implementing and evaluating educational programs. Working with others to enhance their potential to learn and to make informed decisions about health/disease issues is the hallmark of the health education specialist.

Relevance: 20.00%

Abstract:

We calculate the equilibrium thermodynamic properties, percolation threshold, and cluster distribution functions for a model of associating colloids, which consists of hard spherical particles having on their surfaces three short-ranged attractive sites (sticky spots) of two different types, A and B. The thermodynamic properties are calculated using Wertheim's perturbation theory of associating fluids. This also allows us to find the onset of self-assembly, which can be quantified by the maxima of the specific heat at constant volume. The percolation threshold is derived, under the no-loop assumption, for the correlated bond model: in all cases there are two percolated phases that become identical at a critical point, when one exists. Finally, the cluster size distributions are calculated by mapping the model onto an effective model, characterized by a state-dependent functionality f̄ and a unique bonding probability p̄. The mapping is based on the asymptotic limit of the cluster distribution functions of the generic model, and the effective parameters are defined through the requirement that the equilibrium cluster distributions of the true and effective models have the same number-averaged and weight-averaged sizes at all densities and temperatures. We also study the model numerically in the case where BB interactions are missing. In this limit, AB bonds either provide branching between A-chains (Y-junctions) if ε_AB/ε_AA is small, or drive the formation of a hyperbranched polymer if ε_AB/ε_AA is large. We find that the theoretical predictions describe quite accurately the numerical data, especially in the region where Y-junctions are present. There is fairly good agreement between theoretical and numerical results both for the thermodynamic properties of the model (number of bonds and phase coexistence) and for its connectivity properties (cluster size distributions and percolation locus).
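Under the no-loop (tree-like) assumption, the simplest reference point for a percolation threshold is the classical Flory-Stockmayer result for independently bonding f-functional particles; the paper's correlated-bond model and the effective (f̄, p̄) mapping generalize this, so the sketch below is only the baseline against which such refinements are measured:

```python
def flory_stockmayer_threshold(f):
    """Flory-Stockmayer bond-percolation threshold for loopless
    (tree-like) aggregation of f-functional particles: p_c = 1/(f - 1).
    A percolated (gel) phase exists once the bonding probability
    exceeds p_c; for f = 2 (pure chains) the threshold is 1, i.e.
    chains alone never percolate."""
    if f <= 1:
        raise ValueError("need functionality f > 1")
    return 1.0 / (f - 1)
```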

Relevance: 20.00%

Abstract:

We derive a set of differential inequalities for positive definite functions based on previous results derived for positive definite kernels by purely algebraic methods. Our main results show that the global behavior of a smooth positive definite function is, to a large extent, determined solely by the sequence of even-order derivatives at the origin: if a single one of these vanishes then the function is constant; if they are all non-zero and satisfy a natural growth condition, the function is real-analytic and consequently extends holomorphically to a maximal horizontal strip of the complex plane.
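For reference, a continuous f: R → C is positive definite when, for every finite choice of points x_1, …, x_n and complex coefficients c_1, …, c_n,

```latex
\sum_{j=1}^{n}\sum_{k=1}^{n} c_j\,\overline{c_k}\, f(x_j - x_k) \;\ge\; 0 .
```

Two elementary consequences are f(-x) = conj(f(x)) and |f(x)| ≤ f(0). In this notation, the dichotomy stated above says that for a smooth positive definite f: if f^{(2k)}(0) = 0 for a single k ≥ 1, then f is constant; if all even-order derivatives at the origin are non-zero and satisfy the growth condition, f extends holomorphically to a maximal horizontal strip.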

Relevance: 20.00%

Abstract:

Seismic recordings of IRIS/IDA/GSN station CMLA and of several temporary stations in the Azores archipelago are processed with P and S receiver function (PRF and SRF) techniques. Contrary to regional seismic tomography, these methods provide estimates of the absolute velocities and of the Vp/Vs ratio up to a depth of ~300 km. Joint inversion of PRFs and SRFs for a few data sets consistently reveals a division of the subsurface medium into four zones with distinctly different Vp/Vs ratios: the crust, ~20 km thick, with a ratio of ~1.9 in the lower crust; the high-Vs mantle lid, with a strongly reduced Vp/Vs ratio relative to the standard 1.8; the low-velocity zone (LVZ), with a velocity ratio of ~2.0; and the underlying upper-mantle layer with a standard velocity ratio. Our estimates of crustal thickness greatly exceed previous estimates (~10 km). The base of the high-Vs lid (the Gutenberg discontinuity) is at a depth of ~80 km. The LVZ, with a reduction of S velocity of ~15% relative to the standard (IASP91) model, is terminated at a depth of ~200 km. The average thickness of the mantle transition zone (TZ) is evaluated from the time difference between the S410p and SKS660p seismic phases, which are robustly detected in the S and SKS receiver functions. This thickness is practically the same as the standard IASP91 value of 250 km and is characteristic of a large region of the North Atlantic outside the Azores plateau. Our data are indicative of a reduction of the S-wave velocity of several percent relative to the standard velocity in a depth interval from 460 to 500 km. This reduction is found in the nearest vicinities of the Azores, in the region sampled by the PRFs, but, as evidenced by SRFs, it is missing at a distance of a few hundred kilometers from the islands. We speculate that this anomaly may correspond to the source of a plume which generated the Azores hotspot. Previously, a low S velocity in this depth range was found with SRF techniques beneath a few other hotspots.

Relevance: 20.00%

Abstract:

Financial literature and the financial industry often use zero-coupon yield curves as input for testing hypotheses, pricing assets, or managing risk, and assume the provided data are accurate. We analyse the implications of the methodology and of the sample selection criteria used to estimate the zero-coupon bond yield term structure for the resulting volatility of spot rates with different maturities. We obtain the volatility term structure using historical volatilities and EGARCH volatilities. As input for these volatilities we consider our own spot-rate estimates from GovPX bond data and three popular interest rate data sets: from the Federal Reserve Board, from the US Department of the Treasury (H15), and from Bloomberg. We find strong evidence that the resulting zero-coupon bond yield volatility estimates, as well as the correlation coefficients among spot and forward rates, depend significantly on the data set. We observe relevant differences in economic terms when the volatilities are used to price derivatives.
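The simpler of the two volatility measures used above, historical volatility, can be sketched directly; the annualization convention (252 business days) and the use of first differences of the rate series are assumptions of this sketch, and the paper also fits EGARCH models, which this does not show:

```python
import math

def historical_volatility(rates, periods_per_year=252):
    """Annualized historical volatility of a spot-rate series: the
    sample standard deviation of first differences, scaled by
    sqrt(periods_per_year)."""
    changes = [b - a for a, b in zip(rates, rates[1:])]
    n = len(changes)
    mean = sum(changes) / n
    var = sum((c - mean) ** 2 for c in changes) / (n - 1)   # sample variance
    return math.sqrt(var * periods_per_year)
```

Running this on spot rates bootstrapped from each of the four data sets, maturity by maturity, produces the competing volatility term structures the paper compares.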

Relevance: 20.00%

Abstract:

The aims of the present work were to evaluate the influence of several quantities and test parameters on the melt flow index of thermoplastics and to calculate the uncertainty associated with the determinations. In a first stage, the main parameters influencing the determination of the melt flow index were identified; the parameters selected were the plastometer temperature, the load weight, the die diameter, the measurement length, the type of cut, and the number of specimens. To evaluate the influence of these parameters on the measurement of the melt flow index, a design of experiments was carried out, divided into three stages, and analysis of variance was used to treat the results. After the full analysis of the factorial designs, the effects of the plastometer temperature, the load weight and the die diameter were found to be statistically significant for the measurement of the melt flow index. In a second stage, the uncertainty associated with the measurements was calculated. For this, one of the most usual methods was selected, described in the Guide to the Expression of Uncertainty in Measurement and known as the GUM method, using the "step by step" approach. Initially, it was necessary to construct a mathematical model for the measurement of the melt flow index relating the different parameters used. The behaviour of each parameter was studied using two functions, again resorting to analysis of variance. Through the law of propagation of uncertainties it was possible to determine the combined standard uncertainty and, after estimating the number of degrees of freedom, the coverage factor. Finally, the expanded uncertainty of the measurement was determined for the melt volume-flow rate.
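The law of propagation of uncertainties used above can be sketched numerically for uncorrelated inputs. The melt-flow-rate relation MFR = 600·m/t (cut mass m in g, cut interval t in s) is an ISO 1133-style formula used here only as an illustrative measurement model; the function names and the finite-difference sensitivities are this sketch's assumptions, not the work's actual model:

```python
def combined_standard_uncertainty(f, values, uncertainties, h=1e-6):
    """GUM law of propagation of uncertainty for uncorrelated inputs:
    u_c^2(y) = sum_i (df/dx_i)^2 * u^2(x_i), with the sensitivity
    coefficients df/dx_i taken by central finite differences."""
    uc2 = 0.0
    for i, (x, u) in enumerate(zip(values, uncertainties)):
        up, down = list(values), list(values)
        up[i], down[i] = x + h, x - h
        dfdx = (f(*up) - f(*down)) / (2 * h)
        uc2 += (dfdx * u) ** 2
    return uc2 ** 0.5

def mfr(m, t):
    """Illustrative measurement model: melt flow rate (g/10 min)."""
    return 600.0 * m / t
```

Multiplying the combined standard uncertainty by the coverage factor obtained from the effective degrees of freedom then yields the expanded uncertainty reported for the melt volume-flow rate.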