978 results for Expected values
Abstract:
This review assembles pedometry literature focused on youth, with particular attention to expected values for habitual, school day, physical education class, recess, lunch break, out-of-school, weekend, and vacation activity. From 31 studies published since 1999, we constructed a youth habitual activity step-curve that indicates: (a) from ages 6 to 18 years, boys typically take more steps per day than girls; (b) for both sexes the youngest age groups appear to take fewer steps per day than those immediately older; and (c) from a young age, boys decline more in steps per day, becoming more consistent with girls at older ages. Additional studies revealed that boys take approximately 42-49% of daily steps during the school day; girls take 41-47%. Steps taken during physical education class contribute 8.7-23.7% of total steps per day in boys and 11.4-17.2% in girls. Recess represents 8-11% and lunch break represents 15-16% of total steps per day. After-school activity contributes approximately 47-56% of total steps per day for boys and 47-59% for girls. Weekdays range from approximately 12,000 to 16,000 steps per day in boys and 10,000 to 14,000 steps per day in girls. The corresponding values for weekend days are 12,000-13,000 steps per day in boys and 10,000-12,000 steps per day in girls.
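As a hedged illustration of the kind of aggregation behind such a step-curve, the sketch below computes sex- and age-specific medians from published mean steps/day; every study value in it is an invented placeholder, not a figure from the review.

```python
# Minimal sketch of the aggregation behind a step-curve: medians of
# published mean steps/day, grouped by sex and age. All numbers below
# are illustrative placeholders, not values from the review.
from statistics import median

# (sex, age_years, mean_steps_per_day) from hypothetical studies
studies = [
    ("M", 6, 12500), ("M", 6, 13800), ("M", 12, 15200),
    ("F", 6, 11000), ("F", 6, 12100), ("F", 12, 12900),
]

curve = {}
for sex, age, steps in studies:
    curve.setdefault((sex, age), []).append(steps)

for (sex, age), values in sorted(curve.items()):
    print(f"{sex} age {age}: median {median(values):.0f} steps/day")
```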
Abstract:
The figure Beets took exception to displays sex- and age-specific median values of aggregated published expected values for pedometer-determined physical activity.
Abstract:
The purpose of this review is to update expected values for pedometer-determined physical activity in free-living healthy older populations. A search of the literature published since 2001 began with a keyword (pedometer, "step counter," "step activity monitor" or "accelerometer AND steps/day") search of PubMed, Cumulative Index to Nursing & Allied Health Literature (CINAHL), SportDiscus, and PsycINFO. An iterative process was then undertaken to abstract and verify studies of pedometer-determined physical activity (captured in terms of steps taken; distance only was not accepted) in free-living adult populations described as ≥ 50 years of age (studies that included samples which spanned this threshold were not included unless they provided at least some appropriately age-stratified data) and not specifically recruited based on any chronic disease or disability. We identified 28 studies representing at least 1,343 males and 3,098 females ranging in age from 50–94 years. Eighteen (or 64%) of the studies clearly identified using a Yamax pedometer model. Monitoring frames ranged from 3 days to 1 year; the modal length of time was 7 days (17 studies, or 61%). Mean pedometer-determined physical activity ranged from 2,015 steps/day to 8,938 steps/day. In those studies reporting such data, consistent patterns emerged: males generally took more steps/day than similarly aged females, steps/day decreased across study-specific age groupings, and BMI-defined normal weight individuals took more steps/day than overweight/obese older adults. The range of 2,000–9,000 steps/day likely reflects the true variability of physical activity behaviors in older populations. More explicit patterns, for example sex- and age-specific relationships, remain to be informed by future research endeavors.
Abstract:
Objective: To assemble expected values for free-living steps/day in special populations living with chronic illnesses and disabilities. Method: Studies identified since 2000 were categorized into similar illnesses and disabilities, capturing the original reference, sample descriptions, descriptions of instruments used (i.e., pedometers, piezoelectric pedometers, accelerometers), number of days worn, and mean and standard deviation of steps/day. Results: Sixty unique studies represented: 1) heart and vascular diseases, 2) chronic obstructive lung disease, 3) diabetes and dialysis, 4) breast cancer, 5) neuromuscular diseases, 6) arthritis, joint replacement, and fibromyalgia, 7) disability (including mental retardation/intellectual difficulties), and 8) other special populations. A median steps/day was calculated for each category. Waist-mounted and ankle-mounted instruments were considered separately due to fundamental differences in assessment properties. For waist-mounted instruments, the lowest median values for steps/day are found in disabled older adults (1,214 steps/day) followed by people living with COPD (2,237 steps/day). The highest values were seen in individuals with Type 1 diabetes (8,008 steps/day), mental retardation/intellectual disability (7,787 steps/day), and HIV (7,545 steps/day). Conclusion: This review will be useful to researchers/practitioners who work with individuals living with chronic illness and disability and require such information for surveillance, screening, intervention, and program evaluation purposes. Keywords: Exercise; Walking; Ambulatory monitoring
Abstract:
Geochemical variations in shallow water corals provide a valuable archive of paleoclimatic information. However, biological effects can complicate the interpretation of these proxies, forcing their application to rely on empirical calibrations. Carbonate clumped isotope thermometry (Δ47) is a novel paleotemperature proxy based on the temperature-dependent "clumping" of 13C-18O bonds. Similar Δ47-temperature relationships in inorganically precipitated calcite and a suite of biogenic carbonates provide evidence that carbonate clumped isotope variability may record absolute temperature without a biological influence. However, large departures from expected values in the winter growth of a hermatypic coral provided early evidence for possible Δ47 vital effects. Here, we present the first systematic survey of Δ47 in shallow water corals. Sub-annual Red Sea Δ47 in two Porites corals shows a temperature dependence similar to inorganic precipitation experiments, but with a systematic offset toward higher Δ47 values that consistently underestimate temperature by ~8 °C. Additional analyses of Porites, Siderastrea, Astrangia and Caryophyllia corals argue against a number of potential mechanisms as the leading cause for this apparent Δ47 vital effect, including: salinity, organic matter contamination, alteration during sampling, the presence or absence of symbionts, and interlaboratory differences in analytical protocols. However, intra- and inter-coral comparisons suggest that the deviation from expected Δ47 increases with calcification rate. Theoretical calculations suggest this apparent link with calcification rate is inconsistent with pH-dependent changes in dissolved inorganic carbon speciation and with kinetic effects associated with CO2 diffusion into the calcifying space. However, the link with calcification rate may be related to fractionation during the hydration/hydroxylation of CO2 within the calcifying space. Although the vital effects we describe will complicate the interpretation of Δ47 as a paleothermometer in shallow water corals, it may still be a valuable paleoclimate proxy, particularly when applied as part of a multi-proxy approach.
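For orientation, clumped-isotope calibrations generally take the inverse-square form sketched below; the coefficients a and b are calibration-specific assumptions, not values from this study, but the form shows why an offset toward higher Δ47 maps to an underestimated temperature.

```latex
% Generic form of a clumped-isotope calibration; the coefficients a and b
% are calibration-specific and are NOT taken from the study above.
\[
  \Delta_{47} \;=\; \frac{a}{T^{2}} + b, \qquad T \text{ in kelvin,}
\]
% so a positive offset \delta in measured \Delta_{47} yields an apparent
% temperature below the true one:
\[
  T_{\mathrm{apparent}} \;=\; \sqrt{\frac{a}{\Delta_{47} + \delta - b}} \;<\; T .
\]
```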
Abstract:
This paper develops analytical distributions of the temperature indices on which temperature derivatives are written. If the deviations of daily temperatures from their expected values are modelled as an Ornstein-Uhlenbeck process with time-varying variance, then the distribution of the temperature index on which the derivative is written is a sum of truncated, correlated Gaussian deviates. The key result of this paper is an analytical approximation to the distribution of this sum, allowing the accurate computation of payoffs without the need for any simulation. A data set comprising average daily temperatures spanning over a hundred years for four Australian cities is used to demonstrate the efficacy of this approach for estimating the payoffs to temperature derivatives. It is demonstrated that expected payoffs computed directly from historical records are a particularly poor approach to the problem when there are trends in the underlying average daily temperature. It is shown that the proposed analytical approach is superior to historical pricing.
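A minimal Monte Carlo sketch of the setup the paper treats analytically may help: it simulates OU deviations with time-varying variance, accumulates a degree-day index of truncated Gaussian terms, and prices a call on the index by simulation (exactly the step the paper's analytical approximation avoids). All parameter values, including the 18 °C base and the strike, are illustrative assumptions.

```python
# Monte Carlo sketch: daily deviations X_t from a seasonal mean follow an
# Ornstein-Uhlenbeck process with time-varying variance; a heating-degree-day
# index is accumulated over the contract period and a call payoff is averaged.
import numpy as np

rng = np.random.default_rng(0)
n_days, n_paths = 30, 100_000
kappa = 0.3                                        # mean-reversion speed
t = np.arange(n_days)
sigma = 2.0 + 0.5 * np.sin(2 * np.pi * t / 365)    # time-varying volatility
seasonal_mean = 15.0 + 3.0 * np.sin(2 * np.pi * t / 365)

X = np.zeros((n_paths, n_days))
for i in range(1, n_days):                         # Euler step with dt = 1 day
    X[:, i] = X[:, i-1] - kappa * X[:, i-1] + sigma[i] * rng.standard_normal(n_paths)

temps = seasonal_mean + X
hdd_index = np.maximum(0.0, 18.0 - temps).sum(axis=1)  # sum of truncated Gaussian terms
strike = 120.0
payoff = np.maximum(hdd_index - strike, 0.0)           # call on the HDD index
print(f"MC expected payoff: {payoff.mean():.2f}")      # analytic approach avoids this simulation
```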
Abstract:
In this thesis we investigate the use of quantum probability theory for ranking documents. Quantum probability theory is used to estimate the probability of relevance of a document given a user's query. We posit that quantum probability theory can lead to a better estimation of the probability of a document being relevant to a user's query than the common approach, i.e. the Probability Ranking Principle (PRP), which is based upon Kolmogorovian probability theory. Following our hypothesis, we formulate an analogy between the document retrieval scenario and a physical scenario, that of the double-slit experiment. Through this analogy, we propose a novel ranking approach, the quantum probability ranking principle (qPRP). Key to our proposal is the presence of quantum interference. Mathematically, this is the statistical deviation between empirical observations and the expected values predicted by the Kolmogorovian rule of additivity of probabilities of disjoint events in configurations such as that of the double-slit experiment. We propose an interpretation of quantum interference in the document ranking scenario, and examine how quantum interference can be effectively estimated for document retrieval. To validate our proposal and to gain more insights about approaches for document ranking, we (1) analyse PRP, qPRP and other ranking approaches, exposing the assumptions underlying their ranking criteria and formulating the conditions for the optimality of the two ranking principles, (2) empirically compare three ranking principles (i.e. PRP, interactive PRP, and qPRP) and two state-of-the-art ranking strategies in two retrieval scenarios, those of ad-hoc retrieval and diversity retrieval, (3) analytically contrast the ranking criteria of the examined approaches, exposing similarities and differences, and (4) study the ranking behaviours of approaches alternative to PRP in terms of the kinematics they impose on relevant documents, i.e. by considering the extent and direction of the movements of relevant documents across the ranking recorded when comparing PRP against its alternatives. Our findings show that the effectiveness of the examined ranking approaches strongly depends upon the evaluation context. In the traditional evaluation context of ad-hoc retrieval, PRP is empirically shown to be better than or comparable to alternative ranking approaches. However, when we turn to evaluation contexts that account for interdependent document relevance (i.e. when the relevance of a document is assessed also with respect to other retrieved documents, as is the case in the diversity retrieval scenario), the use of quantum probability theory, and thus of qPRP, is shown to improve retrieval and ranking effectiveness over the traditional PRP and alternative ranking strategies, such as Maximal Marginal Relevance, Portfolio theory, and Interactive PRP. This work represents a significant step forward regarding the use of quantum theory in information retrieval. It demonstrates that the application of quantum theory to problems within information retrieval can lead to improvements both in modelling power and retrieval effectiveness, allowing the construction of models that capture the complexity of information retrieval situations. Furthermore, the thesis opens up a number of lines for future research. These include: (1) investigating estimations and approximations of quantum interference in qPRP; (2) exploiting complex numbers for the representation of documents and queries; and (3) applying the concepts underlying qPRP to tasks other than document ranking.
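A hedged sketch of an interference-corrected ranking step in the spirit of qPRP follows. For disjoint events, Kolmogorov additivity gives P(A or B) = P(A) + P(B); the quantum rule adds an interference term 2·sqrt(P(A)P(B))·cos(theta). Approximating cos(theta) by a negated inter-document similarity is one common estimation choice, not necessarily the thesis's exact formulation, and all scores below are toy values.

```python
# Sketch: pick the next document by relevance plus quantum-style
# interference with the documents already ranked.
import math

def next_document(candidates, ranked, p_rel, sim):
    """Candidate maximizing P(rel) plus interference with ranked docs."""
    def score(d):
        interference = sum(
            2 * math.sqrt(p_rel[d] * p_rel[r]) * (-sim(d, r))  # cos(theta) ~ -similarity
            for r in ranked
        )
        return p_rel[d] + interference
    return max(candidates, key=score)

# Toy usage: two near-duplicate documents and one novel document.
p_rel = {"d1": 0.8, "d2": 0.75, "d3": 0.5}
similarity = {("d1", "d2"): 0.9, ("d1", "d3"): 0.1, ("d2", "d3"): 0.1}
sim = lambda a, b: similarity.get((a, b), similarity.get((b, a), 0.0))

ranked = ["d1"]
print(next_document({"d2", "d3"}, ranked, p_rel, sim))  # prefers the novel d3
```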
Abstract:
This paper examines the properties of various approximation methods for solving stochastic dynamic programs in structural estimation problems. The problem addressed is evaluating the expected value of the maximum of available choices. The paper shows that approximating this by the maximum of expected values frequently has poor properties. It also shows that choosing a convenient distributional assumption for the errors and then solving exactly conditional on that assumption leads to small approximation errors even if the distribution is misspecified. © 1997 Cambridge University Press.
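The distinction is easy to see numerically. Under the convenient assumption of i.i.d. standard Gumbel errors, E[max_j(v_j + e_j)] has the closed form euler_gamma + log(sum_j exp(v_j)); the sketch below (with arbitrary illustrative values v) checks it by Monte Carlo and shows how the maximum of expected values understates it.

```python
# E[max] vs max of expected values under i.i.d. standard Gumbel errors.
import numpy as np

rng = np.random.default_rng(1)
v = np.array([1.0, 0.8, 0.5])          # arbitrary illustrative choice values
euler_gamma = 0.5772156649             # mean of a standard Gumbel variate

closed_form = euler_gamma + np.log(np.exp(v).sum())   # E[max_j(v_j + e_j)]
max_of_expectations = v.max() + euler_gamma           # max_j E[v_j + e_j]

eps = rng.gumbel(size=(1_000_000, v.size))
monte_carlo = (v + eps).max(axis=1).mean()

print(f"E[max]  closed form : {closed_form:.4f}")
print(f"E[max]  Monte Carlo : {monte_carlo:.4f}")
print(f"max E   (poor proxy): {max_of_expectations:.4f}")  # strictly smaller
```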
Abstract:
To facilitate marketing and export, the Australian macadamia industry requires accurate crop forecasts. Each year, two levels of crop predictions are produced for this industry. The first is an overall longer-term forecast based on tree census data of growers in the Australian Macadamia Society (AMS). This data set currently accounts for around 70% of total production, and is supplemented by our best estimates of non-AMS orchards. Given these total tree numbers, average yields per tree are needed to complete the long-term forecasts. Yields from regional variety trials were initially used, but were found to be consistently higher than the average yields that growers were obtaining. Hence, a statistical model was developed using growers' historical yields, also taken from the AMS database. This model accounted for the effects of tree age, variety, year, region and tree spacing, and explained 65% of the total variation in the yield per tree data. The second level of crop prediction is an annual climate adjustment of these overall long-term estimates, taking into account the expected effects on production of the previous year's climate. This adjustment is based on relative historical yields, measured as the percentage deviance between expected and actual production. The dominant climatic variables are observed temperature, evaporation, solar radiation and modelled water stress. Initially, a number of alternative statistical models showed good agreement within the historical data, with jack-knife cross-validation R² values of 96% or better. However, forecasts varied quite widely between these alternative models. Exploratory multivariate analyses and nearest-neighbour methods were used to investigate these differences. For 2001-2003, the overall forecasts were in the right direction (when compared with the long-term expected values), but were over-estimates. In 2004 the forecast was well under the observed production, and in 2005 the revised models produced a forecast within 5.1% of the actual production. Over the first five years of forecasting, the absolute deviance for the climate-adjustment models averaged 10.1%, just outside the targeted objective of 10%.
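As a rough illustration of the jack-knife cross-validation R² quoted above, the sketch below fits a linear climate-adjustment model leaving out one year at a time; the predictors and yields are random placeholders, not industry figures.

```python
# Jack-knife (leave-one-out) cross-validation R^2 for a linear
# climate-adjustment model: each year's relative yield deviance is
# predicted from a model fitted to all other years.
import numpy as np

rng = np.random.default_rng(2)
n = 20
climate = rng.normal(size=(n, 3))            # e.g. temperature, evaporation, stress
beta_true = np.array([5.0, -3.0, 2.0])
deviance = climate @ beta_true + rng.normal(scale=2.0, size=n)  # % deviance

preds = np.empty(n)
for i in range(n):
    mask = np.arange(n) != i                 # leave year i out
    beta, *_ = np.linalg.lstsq(climate[mask], deviance[mask], rcond=None)
    preds[i] = climate[i] @ beta

ss_res = ((deviance - preds) ** 2).sum()
ss_tot = ((deviance - deviance.mean()) ** 2).sum()
print(f"jack-knife CV R^2: {1 - ss_res / ss_tot:.2%}")
```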
Abstract:
Ultrafine powders of SrTiO3 are prepared at 100–150°C by the hydrothermal method, starting from TiO2·xH2O gel and Sr(OH)2, with an H2O-isopropanol mixed solvent as the medium. The X-ray diffractograms of the powder show line broadening. The minimum crystallite size obtained ranges from 5 to 20 nm with 20% H2O-80% C3H7OH as the reaction medium, as estimated from X-ray half-peak widths and TEM studies. The electron diffraction results indicate a high concentration of lattice defects in these crystallites. The optical spectra of the particle suspensions in water show that the absorption around the band gap is considerably broadened, together with the appearance of maxima in the far ultraviolet. Aqueous suspensions of SrTiO3 powders, as such, do not produce H2 or O2 on UV irradiation. After coating with rhodium, H2 and O2 are evolved on illumination. However, the turnover number of O2 is lower than the value expected stoichiometrically from the corresponding H2 yield. No correlation of the photocatalytic activity with surface area is observed. The activity of Rh-SrTiO3 slowly deteriorates with extended periods of irradiation.
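The abstract does not name the method behind the half-peak-width estimate, but the standard route is the Scherrer relation D = Kλ/(β cos θ); the sketch below applies it with illustrative peak parameters (Cu Kα wavelength assumed).

```python
# Crystallite size from X-ray line broadening via the Scherrer relation.
# The peak parameters below are illustrative, not the paper's data.
import math

def scherrer_size_nm(fwhm_deg, two_theta_deg, wavelength_nm=0.15406, K=0.9):
    """Crystallite size (nm) from FWHM (degrees 2-theta) of a diffraction peak.
    The wavelength defaults to Cu K-alpha; K is the shape factor."""
    beta = math.radians(fwhm_deg)            # FWHM in radians
    theta = math.radians(two_theta_deg / 2)  # Bragg angle
    return K * wavelength_nm / (beta * math.cos(theta))

# A broad peak implies a small crystallite, consistent with a 5-20 nm range.
print(f"{scherrer_size_nm(fwhm_deg=0.9, two_theta_deg=32.4):.1f} nm")
```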
Abstract:
This thesis consists of an introduction, four research articles and an appendix. The thesis studies relations between two different approaches to the continuum limit of models of two-dimensional statistical mechanics at criticality. The approach of conformal field theory (CFT) could be thought of as the algebraic classification of some basic objects in these models. It has been successfully used by physicists since the 1980s. The other approach, Schramm-Loewner evolutions (SLEs), is a recently introduced set of mathematical methods to study random curves or interfaces occurring in the continuum limit of the models. The first and second included articles argue, on the basis of statistical mechanics, what would be a plausible relation between SLEs and conformal field theory. The first article studies multiple SLEs, that is, several random curves simultaneously in a domain. The proposed definition is compatible with a natural commutation requirement suggested by Dubédat. The curves of multiple SLE may form different topological configurations, "pure geometries". We conjecture a relation between the topological configurations and the CFT concepts of conformal blocks and operator product expansions. Example applications of multiple SLEs include crossing probabilities for percolation and the Ising model. The second article studies SLE variants that represent models with boundary conditions implemented by primary fields. The most well known of these, SLE(kappa, rho), is shown to be simple in terms of the Coulomb gas formalism of CFT. In the third article the space of local martingales for variants of SLE is shown to carry a representation of the Virasoro algebra. Finding this structure is guided by the relation of SLEs and CFTs in general, but the result is established in a straightforward fashion. This article, too, emphasizes multiple SLEs and proposes a possible way of treating pure geometries in terms of the Coulomb gas. The fourth article states results of applications of the Virasoro structure to the open questions of SLE reversibility and duality. Proofs of the stated results are provided in the appendix. The objective is an indirect computation of certain polynomial expected values. Provided that these expected values exist, in generic cases they are shown to possess the desired properties, thus giving support for both reversibility and duality.
Abstract:
Planar curves arise naturally as interfaces between two regions of the plane. An important part of statistical physics is the study of lattice models. This thesis is about the interfaces of 2D lattice models. The scaling limit is an infinite-system limit which is taken by letting the lattice mesh decrease to zero. At criticality, the scaling limit of an interface is one of the SLE curves (Schramm-Loewner evolution), introduced by Oded Schramm. This family of random curves is parametrized by a real variable, which determines the universality class of the model. The first and the second paper of this thesis study properties of SLEs. They contain two different methods to study the whole SLE curve, which is, in fact, the most interesting object from the statistical physics point of view. These methods are applied to study two symmetries of SLE: reversibility and duality. The first paper uses an algebraic method and a representation of the Virasoro algebra to find common martingales to different processes, and in that way to confirm the symmetries for polynomial expected values of natural SLE data. In the second paper, a recursion is obtained for the same kind of expected values. The recursion is based on the stationarity of the law of the whole SLE curve under an SLE-induced flow. The third paper deals with one of the most central questions of the field and provides a framework of estimates for describing 2D scaling limits by SLE curves. In particular, it is shown that a weak estimate on the probability of an annulus crossing implies that a random curve arising from a statistical physics model will have scaling limits and that those will be well described by Loewner evolutions with random driving forces.
Abstract:
Stochastic filtering is, in general, the estimation of indirectly observed states given observed data. This means that one is discussing conditional expected values, among the most accurate estimators given the observations in the context of a probability space. In this thesis, I present the theory of filtering using two different kinds of observation processes: the first is a diffusion process, discussed in the first chapter, while the third chapter introduces the second, a counting process. The majority of the fundamental results of stochastic filtering are stated in the form of equations, such as the unnormalized Zakai equation, which leads to the Kushner-Stratonovich equation. The latter, also known as the normalized Zakai equation or, equally, the Fujisaki-Kallianpur-Kunita (FKK) equation, shows the divergence between the estimate obtained from a diffusion process and that from a counting process. I also introduce an example for the linear Gaussian case, which is essentially the construction underlying the so-called Kalman-Bucy filter. As the unnormalized and normalized Zakai equations are stated in terms of the conditional distribution, a density of these distributions is developed through these equations and stated in Kushner's Theorem. However, Kushner's Theorem takes the form of a stochastic partial differential equation whose solution needs to be verified for existence and uniqueness, which is covered in the second chapter.
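For the linear Gaussian case mentioned above, a discrete-time analogue of the Kalman-Bucy filter (the ordinary Kalman filter) makes the conditional-expected-value computation concrete; all parameter values below are illustrative.

```python
# Scalar Kalman filter: conditional mean and variance of a hidden AR(1)
# state given noisy observations (discrete-time linear Gaussian case).
import numpy as np

rng = np.random.default_rng(3)
a, q, r = 0.95, 0.1, 0.5        # state transition, process var, observation var
n = 100

# simulate the hidden state and its noisy observations
x = np.zeros(n)
for t in range(1, n):
    x[t] = a * x[t-1] + rng.normal(scale=np.sqrt(q))
y = x + rng.normal(scale=np.sqrt(r), size=n)

# filter: mean m and variance p of the conditional distribution
m, p = 0.0, 1.0
means = []
for t in range(n):
    m_pred, p_pred = a * m, a * a * p + q      # predict
    k = p_pred / (p_pred + r)                  # Kalman gain
    m = m_pred + k * (y[t] - m_pred)           # update with the innovation
    p = (1 - k) * p_pred
    means.append(m)

rmse = np.sqrt(np.mean((np.array(means) - x) ** 2))
print(f"filter RMSE: {rmse:.3f} vs raw observation RMSE: {np.sqrt(r):.3f}")
```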