964 results for propellant estimates


Relevance: 20.00%

Abstract:

Analytical expressions are derived for the mean and variance of estimates of the bispectrum of a real time series, assuming a cosinusoidal model. The effects of spectral leakage, inherent in the discrete Fourier transform when the modes present in the signal have a nonintegral number of wavelengths in the record, are included in the analysis. A single phase-coupled triad of modes can cause the bispectrum to have a nonzero mean value over the entire region of computation owing to leakage. The variance of bispectral estimates in the presence of leakage has contributions from individual modes and from triads of phase-coupled modes. Time-domain windowing reduces the leakage. The theoretical expressions for the mean and variance of bispectral estimates are derived in terms of a function dependent on an arbitrary symmetric time-domain window applied to the record, the number of data points, and the statistics of the phase coupling among triads of modes. The theoretical results are verified by numerical simulations for simple test cases and applied to laboratory data to examine phase coupling in a hypothesis-testing framework.
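A minimal numpy sketch of the segment-averaged bispectrum estimate under a cosinusoidal model. The segment length, frequency bins (12, 20, 32), and the synthetic phase-coupled signal are illustrative assumptions, not the paper's data or its exact estimator.

```python
import numpy as np

def bispectrum_estimate(x, nseg, window=None):
    """Segment-averaged bispectrum estimate:
    B(f1, f2) = mean over segments of X(f1) X(f2) conj(X(f1 + f2)).

    An optional symmetric time-domain window (e.g. np.hanning(seglen))
    can be applied to each segment to reduce spectral leakage."""
    seglen = len(x) // nseg
    win = np.ones(seglen) if window is None else window
    nf = seglen // 2
    idx = np.arange(nf)
    sum_idx = idx[:, None] + idx[None, :]      # bin index of f1 + f2
    B = np.zeros((nf, nf), dtype=complex)
    for k in range(nseg):
        X = np.fft.fft(x[k * seglen:(k + 1) * seglen] * win)
        B += np.outer(X[:nf], X[:nf]) * np.conj(X[sum_idx])
    return B / nseg

def triad_signal(nseg, seglen, coupled, rng):
    """Per-segment cosines at bins 12, 20 and 32; the phase of the third
    mode either satisfies p3 = p1 + p2 (a phase-coupled triad) or is
    drawn independently (no coupling)."""
    t = np.arange(seglen)
    segs = []
    for _ in range(nseg):
        p1, p2 = rng.uniform(0.0, 2.0 * np.pi, 2)
        p3 = p1 + p2 if coupled else rng.uniform(0.0, 2.0 * np.pi)
        segs.append(np.cos(2 * np.pi * 12 * t / seglen + p1)
                    + np.cos(2 * np.pi * 20 * t / seglen + p2)
                    + np.cos(2 * np.pi * 32 * t / seglen + p3))
    return np.concatenate(segs)
```

With integer numbers of wavelengths per segment there is no leakage, so the rectangular window suffices; phase coupling then shows up as a much larger |B[12, 20]| for the coupled signal than for the uncoupled one.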

Relevance: 20.00%

Abstract:

Studies of molecular evolutionary rates have yielded a wide range of rate estimates for various genes and taxa. Recent studies based on population-level and pedigree data have produced remarkably high estimates of mutation rate, which strongly contrast with substitution rates inferred in phylogenetic (species-level) studies. Using Bayesian analysis with a relaxed-clock model, we estimated rates for three groups of mitochondrial data: avian protein-coding genes, primate protein-coding genes, and primate D-loop sequences. In all three cases, we found a measurable transition between the high, short-term (<1–2 Myr) mutation rate and the low, long-term substitution rate. The relationship between the age of the calibration and the rate of change can be described by a vertically translated exponential decay curve, which may be used for correcting molecular date estimates. The phylogenetic substitution rates in mitochondria are approximately 0.5% per million years for avian protein-coding sequences and 1.5% per million years for primate protein-coding and D-loop sequences. Further analyses showed that purifying selection offers the most convincing explanation for the observed relationship between the estimated rate and the depth of the calibration. We rule out the possibility that it is a spurious result arising from sequence errors, and find it unlikely that the apparent decline in rates over time is caused by mutational saturation. Using a rate curve estimated from the D-loop data, several dates for last common ancestors were calculated: modern humans and Neandertals (354 ka; 222–705 ka), Neandertals (108 ka; 70–156 ka), and modern humans (76 ka; 47–110 ka). If the rate curve for a particular taxonomic group can be accurately estimated, it can be a useful tool for correcting divergence date estimates by taking the rate decay into account.
Our results show that it is invalid to extrapolate molecular rates of change across different evolutionary timescales, which has important consequences for studies of populations, domestication, conservation genetics, and human evolution.
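The correction idea can be sketched as a vertically translated exponential decay rate curve whose accumulated change is inverted to give a decay-aware age. The parameter values below (long-term rate 1.5%/Myr plus a hypothetical short-term excess) are illustrative assumptions, not the rates fitted in the study.

```python
import numpy as np

def rate_curve(t, lam, a, b):
    """Vertically translated exponential decay: the rate starts at
    lam + a for very recent calibrations and decays toward the
    long-term substitution rate lam as t (Myr) grows."""
    return lam + a * np.exp(-b * t)

def corrected_age(distance, lam, a, b, tmax=20.0, n=200001):
    """Age whose accumulated change, int_0^t rate(s) ds, matches the
    observed molecular distance (crude grid search)."""
    t = np.linspace(0.0, tmax, n)
    accumulated = np.cumsum(rate_curve(t, lam, a, b)) * (t[1] - t[0])
    return t[np.searchsorted(accumulated, distance)]

# Illustrative parameters: long-term rate 1.5%/Myr with a short-term excess.
lam, a, b = 0.015, 0.10, 4.0
naive = 0.03 / lam                          # date ignoring the rate decay
corrected = corrected_age(0.03, lam, a, b)  # younger, decay-aware date
```

Because the short-term rate exceeds the long-term rate, extrapolating the phylogenetic rate to a young divergence overestimates its age; the corrected date is always younger than the naive one.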

Relevance: 20.00%

Abstract:

Long-term changes in the genetic composition of a population occur by the fixation of new mutations, a process known as substitution. The rate at which mutations arise in a population and the rate at which they are fixed are expected to be equal under neutral conditions (Kimura, 1968). Between the appearance of a new mutation and its eventual fate of fixation or loss, there will be a period in which it exists as a transient polymorphism in the population (Kimura and Ohta, 1971). If the majority of mutations are deleterious (and nonlethal), the fixation probabilities of these transient polymorphisms are reduced and the mutation rate will exceed the substitution rate (Kimura, 1983). Consequently, different apparent rates may be observed on different time scales of the molecular evolutionary process (Penny, 2005; Penny and Holmes, 2001). The substitution rate of the mitochondrial protein-coding genes of birds and mammals has been traditionally recognized to be about 0.01 substitutions/site/million years (Myr) (Brown et al., 1979; Ho, 2007; Irwin et al., 1991; Shields and Wilson, 1987), with the noncoding D-loop evolving several times more quickly (e.g., Pesole et al., 1992; Quinn, 1992). Over the past decade, there has been mounting evidence that instantaneous mutation rates substantially exceed substitution rates in a range of organisms (e.g., Denver et al., 2000; Howell et al., 2003; Lambert et al., 2002; Mao et al., 2006; Mumm et al., 1997; Parsons et al., 1997; Santos et al., 2005). The immediate reaction to the first of these findings was that the polymorphisms generated by the elevated mutation rate are short-lived, perhaps extending back only a few hundred years (Gibbons, 1998; Macaulay et al., 1997). That is, purifying selection was thought to remove these polymorphisms very rapidly.
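The neutral equality of mutation and substitution rates, and its breakdown under purifying selection, can be checked numerically: 2Nμ new mutations arise per generation, each fixing with Kimura's fixation probability. The population size, mutation rate, and selection coefficient below are arbitrary illustrations.

```python
import math

def fixation_prob(N, s):
    """Kimura's fixation probability for a single new mutation with
    selection coefficient s in a diploid population of size N;
    the neutral limit is 1/(2N)."""
    if abs(s) < 1e-12:
        return 1.0 / (2 * N)
    return (1.0 - math.exp(-2.0 * s)) / (1.0 - math.exp(-4.0 * N * s))

def substitution_rate(N, mu, s):
    """2N*mu new mutations arise per generation, each fixing with
    probability fixation_prob(N, s); under neutrality this collapses
    to mu, independent of population size."""
    return 2.0 * N * mu * fixation_prob(N, s)
```

With s = 0 the rate equals μ exactly; with s < 0 (deleterious but nonlethal) the fixation probability drops below 1/(2N), so the substitution rate falls below the mutation rate, exactly the pattern described above.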

Relevance: 20.00%

Abstract:

Sequence data often have competing signals that are detected by network programs or Lento plots. Such data can be formed by generating sequences on more than one tree and combining the results (a mixture model). We report that with such mixture models, the estimates of edge (branch) lengths from maximum likelihood (ML) methods that assume a single tree are biased. Based on the observed number of competing signals in real data, such a bias of ML is expected to occur frequently. Because network methods can recover competing signals more accurately, there is a need for ML methods allowing a network. A fundamental problem is that mixture models can have more parameters than can be recovered from the data, so that some mixtures are not, in principle, identifiable. We recommend that network programs be incorporated into best practice analysis, along with ML and Bayesian trees.
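The bias can be illustrated on the simplest possible "tree": a single Jukes-Cantor edge. Mixing site classes generated with two different true lengths and fitting one length to the pooled data underestimates the mean length, because the JC distance correction is convex; the edge lengths below are hypothetical, and this is a toy analogue of the paper's result, not its analysis.

```python
import math

def jc_p(d):
    """Expected proportion of differing sites across a Jukes-Cantor
    edge of length d (substitutions per site)."""
    return 0.75 * (1.0 - math.exp(-4.0 * d / 3.0))

def jc_distance(p):
    """Single-edge ML (Jukes-Cantor) length estimate from an observed
    proportion p of differing sites -- the inverse of jc_p."""
    return -0.75 * math.log(1.0 - 4.0 * p / 3.0)

# Two site classes evolved on edges of different true length (a mixture).
d1, d2 = 0.1, 0.9
true_mean = 0.5 * (d1 + d2)
# Pooling the classes mixes the observed proportions, and the single-edge
# estimate corrects the pooled proportion -- a downward-biased length.
pooled_estimate = jc_distance(0.5 * (jc_p(d1) + jc_p(d2)))
```

By Jensen's inequality the correction of the averaged proportion is smaller than the average of the corrections, so the single-model fit cannot recover the mean of the mixture.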

Relevance: 20.00%

Abstract:

This CD-ROM includes PDFs of presentations on the following topics: "TxDOT Revenue and Expenditure Trends;" "Examine Highway Fund Diversions, & Benchmark Texas Vehicle Registration Fees;" "Evaluation of the JACK Model;" "Future Highway Construction Cost Trends;" "Fuel Efficiency Trends and Revenue Impact"

Relevance: 20.00%

Abstract:

This paper establishes sufficient conditions to bound the error in conditional mean estimates derived from a perturbed model (only the scalar case is shown in this paper, but a similar result is expected to hold for the vector case). The results established here extend recent stability results on approximating information-state filter recursions to stability results on the approximate conditional mean estimates. The presented filter stability results provide bounds for a wide variety of model-error situations.
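A minimal discrete example of the setting: a two-state HMM filter run once with the true model and once with a perturbed transition matrix. All numbers are hypothetical; the point is only that the conditional mean estimates from the two recursions stay close when the model error is small, which is what the paper's bounds quantify.

```python
import numpy as np

def hmm_conditional_means(A, B, pi0, obs):
    """Normalized forward (information-state) recursion
    pi_k proportional to diag(B[:, y_k]) A^T pi_{k-1}, returning the
    conditional mean of the state index at each step."""
    pi = pi0.astype(float).copy()
    states = np.arange(len(pi0))
    means = []
    for y in obs:
        pi = B[:, y] * (A.T @ pi)
        pi /= pi.sum()
        means.append(float(states @ pi))
    return np.array(means)

A_true = np.array([[0.90, 0.10], [0.20, 0.80]])   # true transitions
A_pert = np.array([[0.85, 0.15], [0.25, 0.75]])   # perturbed model
B = np.array([[0.80, 0.20], [0.30, 0.70]])        # P(observation | state)
pi0 = np.array([0.5, 0.5])
obs = [0, 0, 1, 1, 0, 1, 0, 0]

err = np.abs(hmm_conditional_means(A_true, B, pi0, obs)
             - hmm_conditional_means(A_pert, B, pi0, obs))
```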

Relevance: 20.00%

Abstract:

Background: National physical activity data suggest that there is a considerable difference in the physical activity levels of US and Australian adults. Although different surveys (Active Australia and BRFSS) are used, the questions are similar. Different protocols, however, are used to estimate "activity" from the data collected. The primary aim of this study was to assess whether the two approaches to the management of PA data could explain some of the difference in prevalence estimates derived from the two national surveys. Methods: Secondary data analysis of the most recent AA survey (N = 2987). Results: 15% of the sample was defined as "active" using Australian criteria but as "inactive" using the BRFSS protocol, even though weekly energy expenditure was commensurate with meeting current guidelines. Younger respondents (age < 45 y) were more likely to be "misclassified" using the BRFSS criteria. Conclusions: The prevalence of activity in Australia and the US appears to be more similar than we had previously thought.
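The misclassification mechanism can be sketched with two scoring rules applied to the same weekly session data. The thresholds below are simplified stand-ins for the two protocols (a total-minutes criterion versus a session-frequency criterion), not the surveys' exact scoring rules.

```python
def active_total_time(sessions, threshold_min=150):
    """'Active' if total weekly minutes reach a threshold -- an
    energy-expenditure-style criterion (illustrative, not the AA rule)."""
    return sum(sessions) >= threshold_min

def active_frequency(sessions, n_required=5, session_min=30):
    """'Active' only with enough sufficiently long sessions -- a
    frequency-style criterion (illustrative, not the BRFSS rule)."""
    return sum(1 for s in sessions if s >= session_min) >= n_required

week = [60, 60, 60]   # three 60-minute sessions, 180 minutes in total
print(active_total_time(week), active_frequency(week))   # True False
```

A respondent with three long sessions meets the total-minutes criterion but fails the frequency criterion, so the same data yield different prevalence estimates under the two protocols.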

Relevance: 20.00%

Abstract:

During a major flood event, the inundation of urban environments leads to some complicated flow motion, most often associated with significant sediment fluxes. In the present study, a series of field measurements were conducted in an inundated section of the City of Brisbane (Australia) around the peak of a major flood in January 2011. Some experiments were performed to use ADV backscatter amplitude as a surrogate estimate of the suspended sediment concentration (SSC) during the flood event. The flood water deposit samples were predominantly silty material with a median particle size of about 25 μm, and they exhibited a non-Newtonian behavior under rheological testing. In the inundated urban environment during the flood, estimates of suspended sediment concentration presented a general trend of increasing SSC with decreasing water depth. The suspended sediment flux data showed some substantial sediment flux amplitudes, consistent with the murky appearance of floodwaters. Altogether the results highlighted the large suspended sediment loads and fluctuations in the inundated urban setting, possibly associated with a non-Newtonian behavior. During the receding flood, some unusual long-period oscillations were observed (periods of about 18 min), although the cause of these oscillations remains unknown. The field deployment was conducted in challenging conditions, highlighting a number of practical issues during a natural disaster.
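Using backscatter as an SSC surrogate rests on a calibration between the two; a common form is a log-linear fit of concentration against amplitude. The calibration pairs below are hypothetical numbers for illustration, not the Brisbane data, and the fit is a generic least-squares sketch rather than the study's calibration.

```python
import numpy as np

# Hypothetical calibration pairs: ADV backscatter amplitude (counts)
# versus measured suspended sediment concentration (kg/m^3).
amp = np.array([90.0, 110.0, 130.0, 150.0, 170.0])
ssc = np.array([0.05, 0.15, 0.45, 1.30, 4.00])

# Common surrogate model: log10(SSC) linear in backscatter amplitude.
b, a = np.polyfit(amp, np.log10(ssc), 1)

def ssc_from_amplitude(amplitude):
    """Surrogate SSC estimate from backscatter amplitude via the fitted
    log-linear calibration (meaningful only inside the calibrated range)."""
    return 10.0 ** (a + b * amplitude)
```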

Relevance: 20.00%

Abstract:

Background: A random QTL effects model uses a function of probabilities that two alleles in the same or in different animals at a particular genomic position are identical by descent (IBD). Estimates of such IBD probabilities, and therefore modeling and estimating QTL variances, depend on marker polymorphism, the strength of linkage and linkage disequilibrium of markers and QTL, and the relatedness of animals in the pedigree. The effect of relatedness of animals in a pedigree on IBD probabilities and their characteristics was examined in a simulation study. Results: The study was based on nine multi-generational family structures, similar to the pedigree structure of a real dairy population, distinguished by levels of inbreeding increasing from zero to 28% across the studied population. The highest inbreeding level in the pedigree, associated with the highest relatedness, was accompanied by the highest IBD probabilities of two alleles at the same locus, and by lower coefficients of relative variation. Profiles of correlation coefficients of IBD probabilities along the marked chromosomal segment with those at the true QTL position were steepest when the inbreeding coefficient in the pedigree was highest. Precision of estimated QTL location increased with increasing inbreeding and pedigree relatedness. A method to assess the optimum level of inbreeding for QTL detection is proposed, depending on population parameters. Conclusions: An increased overall relationship in a QTL mapping design has positive effects on the precision of QTL position estimates. However, the relationship between inbreeding level and the capacity for QTL detection, which depends on the recombination rate of the QTL and the adjacent informative marker, is not linear. © 2010 Freyer et al., licensee BioMed Central Ltd.
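IBD probabilities ultimately rest on pedigree relationships. As a minimal sketch of that ingredient, here is the classical recursive coancestry calculation on a toy pedigree (not the marker- and linkage-based IBD estimation used in the study); the pedigree itself is invented for illustration.

```python
from functools import lru_cache

# Toy pedigree with parents listed before offspring:
# individual -> (sire, dam); founders have (None, None).
PED = {
    "A": (None, None), "B": (None, None),
    "C": ("A", "B"), "D": ("A", "B"),   # C and D are full sibs
    "E": ("C", "D"),                    # offspring of a full-sib mating
}
ORDER = {ind: i for i, ind in enumerate(PED)}

@lru_cache(maxsize=None)
def kinship(x, y):
    """Coefficient of coancestry f(x, y): the probability that alleles
    drawn at random from x and from y are identical by descent."""
    if x is None or y is None:
        return 0.0
    if x == y:
        sire, dam = PED[x]
        return 0.5 * (1.0 + kinship(sire, dam))
    # Recurse through the parents of the later-listed individual so the
    # recursion always moves toward the founders.
    if ORDER[x] < ORDER[y]:
        x, y = y, x
    sire, dam = PED[x]
    return 0.5 * (kinship(sire, y) + kinship(dam, y))

def inbreeding(ind):
    """Inbreeding coefficient F: the coancestry of the parents."""
    sire, dam = PED[ind]
    return kinship(sire, dam)
```

The offspring of a full-sib mating has the textbook value F = 0.25, illustrating how pedigree structure alone already fixes baseline IBD probabilities before any marker data enter.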

Relevance: 20.00%

Abstract:

Emerging sciences, such as conceptual cost estimating, seem to have to go through two phases. The first phase involves reducing the field of study down to its basic ingredients - from systems development to technological development (techniques) to theoretical development. The second phase operates in the opposite direction, building up techniques from theories, and systems from techniques. Cost estimating is clearly and distinctly still in the first phase. A great deal of effort has been put into the development of both manual and computer-based cost estimating systems during this first phase and, to a lesser extent, the development of a range of techniques that can be used (see, for instance, Ashworth & Skitmore, 1986). Theoretical developments have not, as yet, been forthcoming. All theories need the support of some observational data, and cost estimating is not likely to be an exception. These data do not need to be complete in order to build theories. Just as it is possible to construct an image of a prehistoric animal such as the brontosaurus from only a few key bones and relics, so a theory of cost estimating may possibly be founded on a few factual details. The eternal argument of empiricists and deductionists is that, as theories need factual support, so we need theories in order to know what facts to collect. In cost estimating, the basic facts of interest concern accuracy, the cost of achieving this accuracy, and the trade-off between the two. When cost estimating theories do begin to emerge, it is highly likely that these relationships will be central features. This paper presents some of the facts we have been able to acquire regarding one part of this relationship - accuracy and its influencing factors. Although some of these factors, such as the amount of information used in preparing the estimate, will have cost consequences, we have not yet reached the stage of quantifying these costs.
Indeed, as will be seen, many of the factors do not involve any substantial cost considerations. The absence of any theory is reflected in the arbitrary manner in which the factors are presented. Rather, the emphasis here is on the consideration of purely empirical data concerning estimating accuracy. The essence of good empirical research is to minimize the role of the researcher in interpreting the results of the study. Whilst space does not allow a full treatment of the material in this manner, the principle has been adopted as closely as possible to present results in an uncleaned and unbiased way. In most cases the evidence speaks for itself. The first part of the paper reviews most of the empirical evidence that we have located to date. Knowledge of any work done but omitted here would be most welcome. The second part of the paper presents an analysis of some recently acquired data pertaining to this growing subject.
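Estimating accuracy is commonly summarized through the ratios of estimated to actual cost, with a mean (bias) and a dispersion (consistency) component. A small sketch with entirely hypothetical project figures, not data from the paper:

```python
import statistics

def accuracy_measures(estimates, actuals):
    """Bias and consistency of estimates, expressed through the ratios
    of estimated to actual cost (illustrative data below)."""
    ratios = [e / a for e, a in zip(estimates, actuals)]
    bias = statistics.mean(ratios) - 1.0                      # mean over/under-estimate
    cv = statistics.stdev(ratios) / statistics.mean(ratios)   # consistency
    return bias, cv

# Hypothetical projects: estimated vs final (actual) costs, normalized.
est = [1.10, 0.95, 1.30, 0.80, 1.05]
act = [1.00, 1.00, 1.00, 1.00, 1.00]
bias, cv = accuracy_measures(est, act)
```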

Relevance: 20.00%

Abstract:

X-ray microtomography (micro-CT) with micron resolution enables new ways of characterizing microstructures and opens pathways for forward calculations of multiscale rock properties. A quantitative characterization of the microstructure is the first step in this challenge. We developed a new approach to extract scale-dependent characteristics of porosity, percolation, and anisotropic permeability from 3-D microstructural models of rocks. The Hoshen-Kopelman algorithm of percolation theory is employed for a standard percolation analysis. The anisotropy of permeability is calculated by means of the star volume distribution approach. The local porosity distribution and local percolation probability are obtained by using the local porosity theory. Additionally, the local anisotropy distribution is defined and analyzed through two empirical probability density functions, the isotropy index and the elongation index. For such a high-resolution data set, the typical data sizes of the CT images are on the order of gigabytes to tens of gigabytes; thus an extremely large number of calculations are required. To handle this large memory requirement, parallelization with OpenMP was used to optimally harness the shared-memory infrastructure on cache-coherent Non-Uniform Memory Access architecture machines such as the iVEC SGI Altix 3700Bx2 supercomputer. We see adequate visualization of the results as an important element in this first pioneering study.
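The serial core of a Hoshen-Kopelman percolation analysis can be sketched on a small binary pore image: label pore voxels with union-find during a single raster scan, then test whether one cluster spans the sample. This is a plain-Python illustration of the labeling idea, not the parallel OpenMP implementation used in the study.

```python
import numpy as np

def label_clusters(pore):
    """Hoshen-Kopelman-style labeling of a 3-D binary pore image
    (True = pore) using union-find with 6-connectivity."""
    labels = -np.ones(pore.shape, dtype=int)
    parent = []

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path halving
            i = parent[i]
        return i

    nx, ny, nz = pore.shape
    for x in range(nx):
        for y in range(ny):
            for z in range(nz):
                if not pore[x, y, z]:
                    continue
                # Backward neighbors only: already labeled in scan order.
                roots = []
                for dx, dy, dz in ((-1, 0, 0), (0, -1, 0), (0, 0, -1)):
                    px, py, pz = x + dx, y + dy, z + dz
                    if px >= 0 and py >= 0 and pz >= 0 and pore[px, py, pz]:
                        roots.append(find(labels[px, py, pz]))
                if not roots:
                    parent.append(len(parent))      # new cluster
                    labels[x, y, z] = len(parent) - 1
                else:
                    root = min(roots)
                    labels[x, y, z] = root
                    for r in roots:                 # merge touching clusters
                        parent[r] = root
    for x in range(nx):                             # flatten final labels
        for y in range(ny):
            for z in range(nz):
                if labels[x, y, z] >= 0:
                    labels[x, y, z] = find(labels[x, y, z])
    return labels

def percolates_z(pore):
    """True if a single pore cluster connects the z=0 and z=max faces."""
    labels = label_clusters(pore)
    top = set(labels[:, :, 0][pore[:, :, 0]].ravel())
    bottom = set(labels[:, :, -1][pore[:, :, -1]].ravel())
    return bool(top & bottom)
```

Porosity is simply `pore.mean()`; the local porosity and percolation distributions of the study apply the same measurements within moving subvolumes.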

Relevance: 20.00%

Abstract:

Previous research employing indirect measures of arch structure, such as those derived from footprints, has indicated that obesity results in a "flatter" foot type. In the absence of radiographic measures, however, definitive conclusions regarding the osseous alignment of the foot cannot be made. We determined the effect of body mass index (BMI) on radiographic and footprint-based measures of arch structure. The research was a cross-sectional study in which radiographic and footprint-based measures of foot structure were made in 30 subjects (10 male, 20 female), in addition to standard anthropometric measures of height, weight, and BMI. Multiple (univariate) regression analysis demonstrated that both BMI (β = 0.39, t(26) = 2.12, p = 0.04) and radiographic arch alignment (β = 0.51, t(26) = 3.32, p < 0.01) were significant predictors of footprint-based measures of arch height after controlling for all variables in the model (R² = 0.59, F(3,26) = 12.3, p < 0.01). In contrast, radiographic arch alignment was not significantly associated with BMI (β = −0.03, t(26) = −0.13, p = 0.89) when Arch Index and age were held constant (R² = 0.52, F(3,26) = 9.3, p < 0.01). Adult obesity does not influence osseous alignment of the medial longitudinal arch, but selectively distorts footprint-based measures of arch structure. Footprint-based measures of arch structure should be interpreted with caution when comparing groups of varying body composition.
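The multiple-regression structure of the analysis can be sketched with ordinary least squares on a design matrix. The data below are entirely hypothetical placeholders (the study's measurements are not reproduced here), and this is generic OLS, not the paper's exact statistical procedure.

```python
import numpy as np

# Hypothetical data: footprint Arch Index regressed on BMI and
# radiographic arch alignment (n = 8, illustrative values only).
bmi   = np.array([21.0, 24.0, 27.0, 30.0, 33.0, 23.0, 29.0, 35.0])
radio = np.array([ 4.0,  5.0,  3.5,  6.0,  4.5,  5.5,  3.0,  6.5])
arch  = np.array([0.21, 0.24, 0.27, 0.31, 0.30, 0.23, 0.27, 0.34])

# Design matrix with an intercept column; least squares gives the
# partial (unstandardized) regression coefficients for each predictor.
X = np.column_stack([np.ones_like(bmi), bmi, radio])
coef, *_ = np.linalg.lstsq(X, arch, rcond=None)

predicted = X @ coef
r2 = 1 - ((arch - predicted) ** 2).sum() / ((arch - arch.mean()) ** 2).sum()
```

Each coefficient estimates the effect of its predictor with the other held constant, which is the sense in which the study separates BMI from radiographic alignment as predictors of the footprint measure.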