24 results for Completeness pedigree

in CentAUR: Central Archive University of Reading - UK


Relevance:

20.00%

Publisher:

Abstract:

Resetting of previously accumulated optically stimulated luminescence (OSL) signals during transport of sediment is a fundamental requirement for reliable optical dating. The completeness of optical resetting of 46 modern-age quartz samples from a variety of depositional environments was examined. All equivalent dose (De) estimates were [value not recoverable from source], with the majority of aeolian samples [value not recoverable] and fluvial samples [value not recoverable]. The OSL signal of quartz originates from several trap types with different rates of charge loss during illumination. As such, incomplete bleaching may be identifiable as an increase in De from easy-to-bleach through to hard-to-bleach components. For all modern fluvial samples with non-zero De values, SAR De(t) analysis and component-resolved linearly modulated OSL (LM OSL) De estimates showed this to be the case, implying incomplete resetting of previously accumulated charge. LM OSL measurements were also made to investigate the extent of bleaching of the slow components in the natural environment. In the aeolian sediments examined, the natural LM OSL was effectively zero (i.e. all components were fully reset). The slow components of modern fluvial samples displayed measurable residual signals of up to 15 Gy.
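The component-resolved picture above can be illustrated with a toy multi-component model in which the trapped charge in each component decays exponentially under stimulation at its own rate; the rates and initial populations below are hypothetical illustration values, not measurements from this study.

```python
import numpy as np

# Toy three-component quartz OSL model (fast, medium, slow).
n0 = np.array([1.0, 1.0, 1.0])    # initial trapped charge (arb. units)
b = np.array([2.0, 0.2, 0.02])    # charge-loss rates per second of stimulation

def remaining(t):
    """Trapped charge left in each component after t seconds of light exposure."""
    return n0 * np.exp(-b * t)

for t in (0, 10, 100):
    r = remaining(t)
    print(f"t={t:4d}s  fast={r[0]:.4f}  medium={r[1]:.4f}  slow={r[2]:.4f}")
```

After a short exposure the fast (easy-to-bleach) component is effectively gone while the slow (hard-to-bleach) component has barely moved, which is why a residual dose surviving in the slow components, as observed in the fluvial samples, signals incomplete resetting.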

Relevance:

10.00%

Publisher:

Abstract:

Critical loads are the basis for policies controlling emissions of acidic substances in Europe. The implementation of these policies involves large expenditures, and it is reasonable for policymakers to ask what degree of certainty can be attached to the underlying critical load and exceedance estimates. This paper is a literature review of studies which attempt to estimate the uncertainty attached to critical loads. Critical load models and uncertainty analysis are briefly outlined. Most studies have used Monte Carlo analysis of some form to investigate the propagation of uncertainties in the definition of the input parameters through to uncertainties in critical loads. Though the input parameters are often poorly known, the critical load uncertainties are typically surprisingly small because of a "compensation of errors" mechanism. These results depend on the quality of the uncertainty estimates of the input parameters, and a "pedigree" classification for these is proposed. Sensitivity analysis shows that some input parameters are more important in influencing critical load uncertainty than others, but there have not been enough studies to form a general picture. Methods used for dealing with spatial variation are briefly discussed. Application of alternative models to the same site or modifications of existing models can lead to widely differing critical loads, indicating that research into the underlying science needs to continue.
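One simple way such a compensation of errors arises can be sketched with a Monte Carlo experiment: when independent input errors enter additively, the relative uncertainty of the result is smaller than that of any single input. The three terms and their 30% relative uncertainties below are illustrative assumptions, not parameters from the reviewed studies.

```python
import numpy as np

# Monte Carlo propagation through a toy additive critical-load model.
rng = np.random.default_rng(0)
n = 100_000

term_a = rng.normal(500, 150, n)   # e.g. base-cation weathering
term_b = rng.normal(500, 150, n)   # e.g. a deposition term
term_c = rng.normal(500, 150, n)   # e.g. a critical leaching term

cl = term_a + term_b + term_c      # critical-load samples

rel_in = 150 / 500                 # each input: 30% relative sd
rel_out = cl.std() / cl.mean()     # output: roughly 17% relative sd
print(f"input rel. sd {rel_in:.2f} -> critical-load rel. sd {rel_out:.2f}")
```

Because independent errors partially cancel when added, the output is better constrained in relative terms than any input; the reviewed studies report analogous, model-specific compensation in real critical-load models.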

Relevance:

10.00%

Publisher:

Abstract:

We give a non-commutative generalization of classical symbolic coding in the presence of a synchronizing word. This is done by a scattering theoretical approach. Classically, the existence of a synchronizing word turns out to be equivalent to asymptotic completeness of the corresponding Markov process. A criterion for asymptotic completeness in general is provided by the regularity of an associated extended transition operator. Commutative and non-commutative examples are analysed.

Relevance:

10.00%

Publisher:

Abstract:

This note considers variance estimation for population size estimators based on capture–recapture experiments. Whereas a diversity of estimators of the population size has been suggested, the question of estimating the associated variances is less frequently addressed. This note points out that the technique of conditioning can be applied here successfully, which also allows us to identify the sources of variation: the variance due to estimation of the model parameters and the binomial variance due to sampling n units from a population of size N. It is applied to estimators typically used in capture–recapture experiments in continuous time, including the estimators of Zelterman and Chao, and improves upon previously used variance estimators. In addition, knowledge of the variances associated with the estimators of Zelterman and Chao allows the suggestion of a new estimator as the weighted sum of the two. The decomposition of the variance into the two sources also allows a new understanding of how resampling techniques like the bootstrap could be used appropriately. Finally, the sample size question for capture–recapture experiments is addressed. Since the variance of population size estimators increases with the sample size, it is suggested to use relative measures, such as the observed-to-hidden ratio or the completeness-of-identification proportion, for approaching the question of sample size choice.
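As a sketch of the kind of estimator discussed, the code below implements Chao's lower-bound estimate N̂ = n + f1²/(2·f2) from singleton and doubleton capture counts and checks it on simulated homogeneous Poisson capture data. The simulation settings are assumptions for illustration; the note's variance decomposition itself is not reproduced here.

```python
import numpy as np

def chao_lower_bound(counts):
    """Chao's lower-bound population size estimate from capture frequencies.
    counts: capture counts (all >= 1) of the n observed units."""
    counts = np.asarray(counts)
    n = len(counts)
    f1 = int(np.sum(counts == 1))  # units captured exactly once
    f2 = int(np.sum(counts == 2))  # units captured exactly twice
    if f2 == 0:
        return n + f1 * (f1 - 1) / 2.0  # bias-corrected form when f2 = 0
    return n + f1 ** 2 / (2.0 * f2)

# Simulated continuous-time experiment: N = 1000 units, each captured a
# Poisson(1.0) number of times; zero-capture units remain hidden.
rng = np.random.default_rng(1)
caps = rng.poisson(1.0, 1000)
observed = caps[caps > 0]
print(len(observed), "observed; estimated N =", round(chao_lower_bound(observed)))
```

For homogeneous capture rates the estimate recovers the true size closely; under heterogeneity it is a lower bound, which is part of why the associated variance matters.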

Relevance:

10.00%

Publisher:

Abstract:

Many populations have recovered from severe bottlenecks either naturally or through intensive conservation management. In the past, however, few conservation programs have monitored the genetic health of recovering populations. We conducted a conservation genetic assessment of a small, reintroduced population of Mauritius Kestrel (Falco punctatus) to determine whether genetic deterioration has occurred since its reintroduction. We used pedigree analysis that partially accounted for individuals of unknown origin to document that (1) inbreeding occurred frequently (2.6% increase per generation; NeI = 18.9), (2) 25% of breeding pairs were composed of either closely or moderately related individuals, (3) genetic diversity has been lost from the population (1.6% loss per generation; NeV = 32.1) less rapidly than the corresponding increase in inbreeding, and (4) ignoring the contribution of unknown individuals to a pedigree will bias the metrics derived from that pedigree, ultimately obscuring the prevailing genetic dynamics. The rates of inbreeding and loss of genetic variation in the subpopulation of Mauritius Kestrel we examined were extreme and among the highest yet documented in a wild vertebrate population. Thus, genetic deterioration may affect this population's long-term viability. Remedial conservation strategies are needed to reduce the impact of inbreeding and loss of genetic variation in this species. We suggest that schemes to monitor genetic variation after reintroduction should be an integral component of endangered species recovery programs.
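The pedigree metrics referred to above rest on Wright's kinship recursion, in which an individual's inbreeding coefficient equals the kinship of its parents. A minimal sketch on a toy pedigree (illustrative only, not the Mauritius Kestrel data); unknown parents are treated as unrelated founders, the simplest way of handling individuals of unknown origin:

```python
# Toy pedigree: id -> (sire, dam); None means an unknown/founder parent.
PED = {
    "A": (None, None), "B": (None, None),   # founders
    "C": ("A", "B"), "D": ("A", "B"),       # full siblings
    "E": ("C", "D"),                        # offspring of a full-sib pair
}
ORDER = list(PED)  # birth order: parents precede offspring

def kinship(x, y):
    if x is None or y is None:
        return 0.0
    if x == y:
        s, d = PED[x]
        return 0.5 * (1.0 + kinship(s, d))
    if ORDER.index(x) < ORDER.index(y):     # recurse on the younger individual
        x, y = y, x
    s, d = PED[x]
    return 0.5 * (kinship(s, y) + kinship(d, y))

def inbreeding(x):
    """Inbreeding coefficient = kinship of the parents."""
    s, d = PED[x]
    return kinship(s, d)

print(inbreeding("E"))  # full-sib mating gives F = 0.25
```

Treating unknowns as unrelated founders is exactly the simplification the paper warns about: it biases the derived metrics, which is why partially accounting for unknown individuals matters.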

Relevance:

10.00%

Publisher:

Abstract:

Genetic parameters and breeding values for dairy cow fertility were estimated from 62 443 lactation records. Two-trait analysis of fertility and milk yield was investigated as a method to estimate fertility breeding values when culling or selection based on milk yield in early lactation determines the presence or absence of fertility observations in later lactations. Fertility traits were calving interval, intervals from calving to first service, calving to conception and first to last service, conception success to first service and number of services per conception. Milk production traits were 305-day milk, fat and protein yield. For fertility traits, the range of heritability estimates (h²) was 0.012 to 0.028 and of permanent environmental variance (c²) was 0.016 to 0.032. Genetic correlations (rg) among fertility traits were generally high (> 0.70). Genetic correlations of fertility with milk production traits were unfavourable (range -0.11 to 0.46). Single- and two-trait analyses of fertility were compared using the same data set. The estimates of h² and c² were similar for the two types of analysis. However, there were differences between estimated breeding values and rankings for the same trait from single- versus multi-trait analyses. The range for rank correlation was 0.69-0.83 for all animals in the pedigree and 0.89-0.96 for sires with more than 25 daughters. As the single-trait method is biased by selection on milk yield, a multi-trait evaluation of fertility with milk yield is recommended.
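The rank correlations quoted above compare the orderings of breeding values from the two evaluations, which can be computed as a Spearman correlation. A minimal sketch with hypothetical breeding values (the numbers are invented for illustration, not taken from this study):

```python
import numpy as np

def spearman(x, y):
    """Spearman rank correlation for score vectors without ties."""
    rx = np.argsort(np.argsort(x)) - (len(x) - 1) / 2.0  # centered ranks
    ry = np.argsort(np.argsort(y)) - (len(y) - 1) / 2.0
    return float((rx @ ry) / np.sqrt((rx @ rx) * (ry @ ry)))

# Hypothetical estimated breeding values for six sires from a single-trait
# and a two-trait evaluation.
ebv_single = np.array([0.10, -0.05, 0.22, 0.01, -0.12, 0.30])
ebv_multi = np.array([0.12, -0.02, 0.18, -0.03, -0.10, 0.35])
print(round(spearman(ebv_single, ebv_multi), 3))
```

A value near 1 means the two analyses rank the animals almost identically; the study's range of 0.69-0.83 over all animals indicates substantial reranking.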

Relevance:

10.00%

Publisher:

Abstract:

The separation of mixtures of proteins by SDS-polyacrylamide gel electrophoresis (SDS-PAGE) is a technique that is widely used—and, indeed, this technique underlies many of the assays and analyses that are described in this book. While SDS-PAGE is routine in many labs, a number of issues require consideration before embarking on it for the first time. We felt, therefore, that in the interest of completeness of this volume, a brief chapter describing the basics of SDS-PAGE would be helpful. Also included in this chapter are protocols for the staining of SDS-PAGE gels to visualize separated proteins, and for the electrotransfer of proteins to a membrane support (Western blotting) to enable immunoblotting, for example. This chapter is intended to complement the chapters in this book that require these techniques to be performed. Therefore, detailed examples of why and when these techniques could be used will not be discussed here.

Relevance:

10.00%

Publisher:

Abstract:

This paper presents ongoing research to integrate process automation and process management support in the context of media production. This is addressed through a holistic software engineering approach applied to media production modelling, to ensure design correctness, completeness and effectiveness. The focus of the research and development has been to enhance metadata management throughout the process, in a similar fashion to that achieved in Decision Support Systems (DSS), to facilitate well-grounded business decisions. The paper sets out the aims, objectives and methodology deployed, describes the solution in some detail, and presents preliminary conclusions and planned future work.

Relevance:

10.00%

Publisher:

Abstract:

We present a novel topology of the radial basis function (RBF) neural network, referred to as the boundary value constraints (BVC)-RBF, which is able to automatically satisfy a set of BVC. Unlike most existing neural networks, whereby the model is identified via learning from observational data only, the proposed BVC-RBF offers a generic framework that takes into account both deterministic prior knowledge and stochastic data in an intelligent manner. Like a conventional RBF, the proposed BVC-RBF has a linear-in-the-parameters structure, so that many of the existing algorithms for linear-in-the-parameters models are directly applicable. The BVC satisfaction properties of the proposed BVC-RBF are discussed. Finally, numerical examples based on the combined D-optimality-based orthogonal least squares algorithm are used to illustrate the performance of the proposed BVC-RBF for completeness.
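The paper's construction is not reproduced here, but one generic way to make a linear-in-the-parameters RBF model satisfy boundary values automatically (an assumption for illustration, not necessarily the BVC-RBF itself) is to write f(x) = B(x) + h(x)·g(x), where B interpolates the boundary values, h vanishes on the boundary, and g is the RBF expansion whose weights are fitted by least squares. A 1-D sketch under these assumptions:

```python
import numpy as np

a, b = 1.0, -1.0                      # required boundary values f(0)=a, f(1)=b

def B(x):                             # interpolant of the boundary values
    return a * (1 - x) + b * x

def h(x):                             # vanishes at x = 0 and x = 1
    return x * (1 - x)

def rbf(x, centers, width=0.2):       # Gaussian RBF design matrix
    return np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2 * width ** 2))

rng = np.random.default_rng(0)
centers = np.linspace(0, 1, 8)
x_tr = rng.uniform(0, 1, 50)
y_tr = np.sin(2 * np.pi * x_tr)       # toy target

# Linear-in-the-parameters: plain least squares for the weights only.
Phi = h(x_tr)[:, None] * rbf(x_tr, centers)
w, *_ = np.linalg.lstsq(Phi, y_tr - B(x_tr), rcond=None)

def f(x):
    return B(x) + h(x) * (rbf(x, centers) @ w)

print(f(np.array([0.0, 1.0])))        # boundary values hold for any w
```

Because h zeroes the learned part on the boundary, the constraints hold exactly regardless of the fitted weights, while the model remains linear in the parameters.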

Relevance:

10.00%

Publisher:

Abstract:

This paper introduces a new neurofuzzy model construction algorithm for nonlinear dynamic systems based upon basis functions that are Bezier-Bernstein polynomial functions. The algorithm is generalized in that it copes with n-dimensional inputs, using an additive decomposition construction to overcome the curse of dimensionality associated with high n. The construction also introduces univariate Bezier-Bernstein polynomial functions for the completeness of the generalized procedure. Like B-spline expansion based neurofuzzy systems, Bezier-Bernstein polynomial function based neurofuzzy networks hold desirable properties such as nonnegativity of the basis functions, unity of support, and interpretability of the basis functions as fuzzy membership functions, with the additional advantages of structural parsimony and a Delaunay input space partition, essentially overcoming the curse of dimensionality associated with conventional fuzzy and RBF networks. The modeling network is based on an additive decomposition approach together with two separate basis function formation approaches for the univariate and bivariate Bezier-Bernstein polynomial functions used in model construction. The overall network weights are then learnt using conventional least squares methods. Numerical examples are included to demonstrate the effectiveness of this new data-based modeling approach.
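The interpretability claims rest on two standard properties of the Bernstein basis B_{i,n}(t) = C(n,i)·t^i·(1-t)^(n-i): nonnegativity and partition of unity on [0, 1]. A short check:

```python
from math import comb

def bernstein(i, n, t):
    """Bernstein basis polynomial B_{i,n}(t) = C(n,i) * t**i * (1-t)**(n-i)."""
    return comb(n, i) * t ** i * (1 - t) ** (n - i)

# Nonnegativity and partition of unity -- the properties that allow the
# basis functions to be read as fuzzy membership functions.
n = 4
for t in (0.0, 0.3, 0.7, 1.0):
    vals = [bernstein(i, n, t) for i in range(n + 1)]
    assert all(v >= 0 for v in vals)           # each basis value is >= 0
    assert abs(sum(vals) - 1.0) < 1e-12        # values sum to one
print("nonnegativity and partition of unity hold for n =", n)
```

These hold for every degree n, which is why a model written in this basis inherits a membership-function reading of its weights.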

Relevance:

10.00%

Publisher:

Abstract:

Information on the genetic variation of plant response to elevated CO2 (e[CO2]) is needed to understand plant adaptation and to pinpoint likely evolutionary response to future high atmospheric CO2 concentrations.
• Here, quantitative trait loci (QTL) for above- and below-ground tree growth were determined in a pedigree – an F2 hybrid of poplar (Populus trichocarpa and Populus deltoides) – following season-long exposure to either current-day ambient CO2 (a[CO2]) or e[CO2] at 600 µl l⁻¹, and genotype by environment interactions were investigated.
• In the F2 generation, both above- and below-ground growth showed a significant increase in e[CO2]. Three areas of the genome on linkage groups I, IX and XII were identified as important in determining above-ground growth response to e[CO2], while an additional three areas of the genome on linkage groups IV, XVI and XIX appeared important in determining root growth response to e[CO2].
• These results quantify and identify genetic variation in response to e[CO2] and provide an insight into genomic response to the changing environment.

Relevance:

10.00%

Publisher:

Abstract:

The family of theories dubbed ‘luck egalitarianism’ represents an attempt to infuse egalitarian thinking with a concern for personal responsibility, arguing that inequalities are just when they result from, or to the extent that they result from, choice, but are unjust when they result from, or to the extent that they result from, luck. In this essay I argue that luck egalitarians should sometimes seek to limit inequalities even when they have a fully choice-based pedigree (i.e., result only from the choices of agents). I grant that the broad approach is correct but argue that the temporal standpoint from which we judge whether the person can be held responsible, or the extent to which they can be held responsible, should be radically altered. Instead of asking, as Standard (or Static) Luck Egalitarianism seems to, whether or not, or to what extent, a person was responsible for the choice at the time of choosing, and asking the question of responsibility only once, we should ask whether, or to what extent, they are responsible for the choice at the point at which we are seeking to discover whether, or to what extent, the inequality is just; the question of responsibility is thus not settled but constantly under review. Such an approach will differ from Standard Luck Egalitarianism only if responsibility for a choice is not set in stone: if responsibility can weaken, then we should not see the boundary between luck and responsibility within a particular action as static. Drawing on Derek Parfit’s illuminating discussions of personal identity, and the contemporary literature on moral responsibility, I suggest there are good reasons to think that responsibility can weaken – that we are not necessarily fully responsible for a choice forever, even if we were fully responsible at the time of choosing. I call the variant of luck egalitarianism that recognises this shift in temporal standpoint, and that responsibility can weaken, Dynamic Luck Egalitarianism (DLE).
In conclusion I offer a preliminary discussion of what kind of policies DLE would support.