185 results for Mean-variance.


Relevance: 20.00%

Abstract:

Empirical orthogonal functions (EOFs) are widely used in climate research to identify dominant patterns of variability and to reduce the dimensionality of climate data. EOFs, however, can be difficult to interpret. Rotated empirical orthogonal functions (REOFs) have been proposed as more physical entities with simpler patterns than EOFs. This study presents a new approach for finding climate patterns with simple structures that overcomes the problems encountered with rotation. The method achieves simplicity of the patterns by using the main properties of EOFs and REOFs simultaneously. Orthogonal patterns that maximise variance subject to a constraint that induces a form of simplicity are found. The simplified empirical orthogonal function (SEOF) patterns, being more 'local', are constrained to have zero loadings outside the main centre of action. The method is applied to winter Northern Hemisphere (NH) monthly mean sea level pressure (SLP) reanalyses over the period 1948-2000. The 'simplified' leading patterns of variability are identified and compared to the leading patterns obtained from EOFs and REOFs. Copyright (C) 2005 Royal Meteorological Society.
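The baseline the abstract starts from, standard (unrotated) EOF analysis, amounts to a singular value decomposition of the centred anomaly matrix. A minimal sketch with a hypothetical data matrix (dimensions and values are illustrative, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical anomaly data: 600 monthly fields at 100 grid points
X = rng.standard_normal((600, 100))
X -= X.mean(axis=0)                      # remove the time mean

# EOFs are the right singular vectors of the anomaly matrix
U, s, Vt = np.linalg.svd(X, full_matrices=False)
eofs = Vt                                # spatial patterns, one per row
var_frac = s**2 / np.sum(s**2)           # fraction of variance per EOF

print(var_frac[:3])
```

The SEOF patterns described in the abstract modify this variance-maximization problem by adding a constraint that forces loadings to zero outside each pattern's main centre of action.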

Relevance: 20.00%

Abstract:

The effect of fluctuating daily surface fluxes on the time-mean oceanic circulation is studied using an empirical flux model. The model produces fluctuating fluxes resulting from atmospheric variability and includes oceanic feedbacks on the fluxes. Numerical experiments were carried out by driving an ocean general circulation model with three different versions of the empirical model. It is found that fluctuating daily fluxes lead to an increase in the meridional overturning circulation (MOC) of the Atlantic of about 1 Sv and a decrease in the Antarctic circumpolar current (ACC) of about 32 Sv. The changes are approximately 7% of the MOC and 16% of the ACC obtained without fluctuating daily fluxes. The fluctuating fluxes change the intensity and the depth of vertical mixing. This, in turn, changes the density field and thus the circulation. Fluctuating buoyancy fluxes change the vertical mixing in a non-linear way: they tend to increase the convective mixing in mostly stable regions and to decrease the convective mixing in mostly unstable regions. The ACC changes are related to the enhanced mixing in the subtropical and the mid-latitude Southern Ocean and reduced mixing in the high-latitude Southern Ocean. The enhanced mixing is related to an increase in the frequency and the depth of convective events. As these events bring denser water downward, the mixing changes lead to a reduction in the meridional gradient of the depth-integrated density in the Southern Ocean and hence in the strength of the ACC. The MOC changes are related to more subtle density changes. It is found that the vertical mixing in a latitudinal strip in the northern North Atlantic is more strongly enhanced by fluctuating fluxes than the mixing in a latitudinal strip in the South Atlantic. This leads to an increase in the density difference between the two strips, which can be responsible for the increase in the Atlantic MOC.

Relevance: 20.00%

Abstract:

We have previously placed the solar contribution to recent global warming in context using observations and without recourse to climate models. It was shown that all solar forcings of climate have declined since 1987. The present paper extends that analysis to include the effects of the various time constants with which the Earth’s climate system might react to solar forcing. The solar input waveform over the past 100 years is defined using observed and inferred galactic cosmic ray fluxes, valid for either a direct effect of cosmic rays on climate or an effect via their known correlation with total solar irradiance (TSI), or for a combination of the two. The implications, and the relative merits, of the various TSI composite data series are discussed and independent tests reveal that the PMOD composite used in our previous paper is the most realistic. Use of the ACRIM composite, which shows a rise in TSI over recent decades, is shown to be inconsistent with most published evidence for solar influences on pre-industrial climate. The conclusions of our previous paper, that solar forcing has declined over the past 20 years while surface air temperatures have continued to rise, are shown to apply for the full range of potential time constants for the climate response to the variations in the solar forcings.

Relevance: 20.00%

Abstract:

A multivariate fit to the variation in global mean surface air temperature anomaly over the past half century is presented. The fit procedure allows for the effect of response time on the waveform, amplitude and lag of each radiative forcing input, and each is allowed to have its own time constant. It is shown that the contribution of solar variability to the temperature trend since 1987 is small and downward; the best estimate is -1.3% and the 2σ confidence level sets the uncertainty range at -0.7 to -1.9%. The result is the same whether one quantifies the solar variation using galactic cosmic ray fluxes (for which the analysis can be extended back to 1953) or the most accurate total solar irradiance data composite. The rise in global mean surface air temperature is predominantly associated with a linear increase that represents the combined effects of changes in anthropogenic well-mixed greenhouse gases and aerosols, although, in recent decades, there is also a considerable contribution from the relative lack of major volcanic eruptions. The best estimate is that the anthropogenic factors contribute 75% of the rise since 1987, with an uncertainty range (set by the 2σ confidence level using an AR(1) noise model) of 49-160%; thus, the uncertainty is large, but we can state that at least half of the temperature trend comes from the linear term and that this term could explain the entire rise. The results are consistent with the Intergovernmental Panel on Climate Change (IPCC) estimates of the changes in radiative forcing (given for 1961-1995) and are here combined with those estimates to find the response times, equilibrium climate sensitivities and pertinent heat capacities (i.e. the depth into the oceans to which a given radiative forcing variation penetrates) of the quasi-periodic (decadal-scale) input forcing variations.
As shown by previous studies, the decadal-scale variations do not penetrate as deeply into the oceans as the longer term drifts and have shorter response times. Hence, conclusions about the response to century-scale forcing changes (and hence the associated equilibrium climate sensitivity and the temperature rise commitment) cannot be made from studies of the response to shorter period forcing changes.
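The fit described above pairs each forcing with its own first-order (exponential) response before regressing temperature on the lagged forcings. A purely illustrative sketch of that idea on synthetic series (all inputs, time constants and coefficients are hypothetical, not the paper's data or code):

```python
import numpy as np

rng = np.random.default_rng(4)
t = np.arange(600)                        # months (synthetic)
# Three hypothetical forcing series as slow random walks
forcings = rng.standard_normal((3, t.size)).cumsum(axis=1) * 0.01

def respond(f, tau):
    """First-order response: convolve forcing with a normalized exp decay."""
    kernel = np.exp(-np.arange(f.size) / tau)
    kernel /= kernel.sum()
    return np.convolve(f, kernel)[: f.size]

taus = [6.0, 24.0, 60.0]                  # one time constant per forcing
X = np.column_stack([respond(f, tau) for f, tau in zip(forcings, taus)])
# Synthetic "temperature" built from the responses plus noise
temp = X @ np.array([0.8, -0.3, 0.5]) + 0.05 * rng.standard_normal(t.size)

# Multivariate least-squares fit of temperature to the lagged forcings
coef, *_ = np.linalg.lstsq(np.column_stack([np.ones(t.size), X]), temp,
                           rcond=None)
print(coef.round(2))
```

In the paper the time constants themselves are also varied in the fit; here they are fixed for brevity.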

Relevance: 20.00%

Abstract:

A powerful way to test the realism of ocean general circulation models is to systematically compare observations of passive tracer concentration with model predictions. The general circulation models used in this way cannot resolve the full range of vigorous mesoscale activity (on length scales of 10-100 km). In the real ocean, however, this activity causes important variability in tracer fields. Thus, in order to rationally compare tracer observations with model predictions, these unresolved fluctuations (the model variability error) must be estimated. We have analyzed this variability using an eddy-resolving reduced-gravity model in a simple midlatitude double-gyre configuration. We find that the wave number spectrum of tracer variance is only weakly sensitive to the distribution of (large-scale, slowly varying) tracer sources and sinks. This suggests that a universal passive tracer spectrum may exist in the ocean. We estimate the spectral shape using high-resolution measurements of potential temperature on an isopycnal in the upper northeast Atlantic Ocean, finding a slope near k^(-1.7) between 10 and 500 km. The typical magnitude of the variance is estimated by comparing tracer simulations at different resolutions. For CFC- and tritium-type transient tracers the peak magnitude of the model variability saturation error may reach 0.20 for scales shorter than 100 km. This is of the same order as the time-mean saturation itself and well over an order of magnitude greater than the instrumental uncertainty.
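A spectral slope of the kind quoted above is typically estimated by a log-log least-squares fit to the periodogram of a transect. A self-contained sketch on synthetic data constructed to have a k^(-1.7) spectrum (values hypothetical; the paper's slope comes from observed potential temperature):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4096
# Build a synthetic 1-D tracer transect with a k^(-1.7) power spectrum
k = np.fft.rfftfreq(n)[1:]                      # positive wavenumbers
amp = k ** (-1.7 / 2)                           # amplitude = sqrt(power)
spec = amp * np.exp(1j * rng.uniform(0, 2 * np.pi, k.size))
tracer = np.fft.irfft(np.concatenate(([0.0], spec)), n)

# Periodogram (Nyquist bin dropped) and log-log slope fit
power = np.abs(np.fft.rfft(tracer))[1:-1] ** 2
slope, _ = np.polyfit(np.log(k[:-1]), np.log(power), 1)
print(round(slope, 2))                          # prints -1.7
```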

Relevance: 20.00%

Abstract:

This paper investigates the application of capture–recapture methods to human populations. Capture–recapture methods are commonly used to estimate the size of wildlife populations, but can also be used in epidemiology and the social sciences, for estimating the prevalence of a particular disease or the size of the homeless population in a certain area. Here we focus on estimating the prevalence of infectious diseases. Several estimators of population size are considered: the Lincoln–Petersen estimator and its modified version, the Chapman estimator; Chao’s lower bound estimator; Zelterman’s estimator; McKendrick’s moment estimator; and the maximum likelihood estimator. In order to evaluate these estimators, they are applied to real three-source capture–recapture data. By conditioning on each of the sources of the three-source data, we have been able to compare the estimators with the true value that they are estimating. The Chapman and Chao estimators were compared in terms of their relative bias. A variance formula derived through conditioning is suggested for Chao’s estimator, and normal 95% confidence intervals are calculated for this and the Chapman estimator. We then compare the coverage of the respective confidence intervals. Furthermore, a simulation study is included to compare Chao’s and Chapman’s estimators. Results indicate that Chao’s estimator is less biased than Chapman’s estimator unless both sources are independent. Chao’s estimator also has the smaller mean squared error. Finally, the implications and limitations of the above methods are discussed, with suggestions for further development.
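Three of the estimators compared above have simple closed forms. A sketch with hypothetical counts (two-source counts n1, n2, m for Lincoln-Petersen and Chapman; capture-frequency counts f1, f2 for Chao's lower bound):

```python
# Two-source capture-recapture: n1, n2 caught by each source, m by both.
# All counts are hypothetical, for illustration only.
n1, n2, m = 120, 150, 30

# Lincoln-Petersen estimator: N_hat = n1 * n2 / m
lincoln_petersen = n1 * n2 / m

# Chapman's bias-corrected version: (n1 + 1)(n2 + 1)/(m + 1) - 1
chapman = (n1 + 1) * (n2 + 1) / (m + 1) - 1

# Chao's lower bound uses frequency-of-capture counts:
# n_obs observed units, f1 seen exactly once, f2 seen exactly twice.
n_obs, f1, f2 = 160, 80, 40
chao = n_obs + f1**2 / (2 * f2)

print(lincoln_petersen, round(chapman, 1), chao)  # prints 600.0 588.4 240.0
```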

Relevance: 20.00%

Abstract:

This note considers variance estimation for population size estimators based on capture–recapture experiments. Whereas a diversity of estimators of the population size has been suggested, the question of estimating the associated variances is less frequently addressed. This note points out that the technique of conditioning can be applied here successfully, which also allows us to identify the sources of variation: the variance due to estimation of the model parameters and the binomial variance due to sampling n units from a population of size N. It is applied to estimators typically used in capture–recapture experiments in continuous time, including the estimators of Zelterman and Chao, and improves upon previously used variance estimators. In addition, knowledge of the variances associated with the estimators of Zelterman and Chao allows the suggestion of a new estimator as the weighted sum of the two. The decomposition of the variance into the two sources also allows a new understanding of how resampling techniques like the bootstrap could be used appropriately. Finally, the sample size question for capture–recapture experiments is addressed. Since the variance of population size estimators increases with the sample size, it is suggested to use relative measures, such as the observed-to-hidden ratio or the completeness-of-identification proportion, for approaching the question of sample size choice.
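The conditioning technique rests on the law of total variance, Var(Y) = E[Var(Y | X)] + Var(E[Y | X]), which splits the variance into exactly the two sources named above. A small Monte Carlo check of the decomposition in a hypothetical two-stage setting:

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical two-stage experiment: X ~ Poisson(10), then Y | X ~ Bin(X, 0.3)
reps = 200_000
x = rng.poisson(10, reps)
y = rng.binomial(x, 0.3)

total_var = y.var()
within = (x * 0.3 * 0.7).mean()          # E[Var(Y|X)] = E[X p (1 - p)]
between = (x * 0.3).var()                # Var(E[Y|X]) = Var(X p)

# The two sides of the law of total variance should nearly agree
print(total_var, within + between)
```

Here the "within" term plays the role of the binomial sampling variance and the "between" term the variance from the first stage.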

Relevance: 20.00%

Abstract:

The importance of temperature in the determination of the yield of an annual crop (groundnut; Arachis hypogaea L. in India) was assessed. Simulations from a regional climate model (PRECIS) were used with a crop model (GLAM) to examine crop growth under simulated current (1961-1990) and future (2071-2100) climates. Two processes were examined: the response of crop duration to mean temperature and the response of seed-set to extremes of temperature. The relative importance of, and interaction between, these two processes was examined for a number of genotypic characteristics, which were represented by using different values of crop model parameters derived from experiments. The impact of mean and extreme temperatures varied geographically, and depended upon the simulated genotypic properties. High temperature stress was not a major determinant of simulated yields in the current climate, but affected the mean and variability of yield under climate change in two regions which had contrasting statistics of daily maximum temperature. Changes in mean temperature had a similar impact on mean yield to that of high temperature stress in some locations and its effects were more widespread. Where the optimal temperature for development was exceeded, the resulting increase in duration in some simulations fully mitigated the negative impacts of extreme temperatures when sufficient water was available for the extended growing period. For some simulations the reduction in mean yield between the current and future climates was as large as 70%, indicating the importance of genotypic adaptation to changes in both means and extremes of temperature under climate change. (c) 2006 Elsevier B.V. All rights reserved.

Relevance: 20.00%

Abstract:

The farm-level success of Bt-cotton in developing countries is well documented. However, the literature has only recently begun to recognise the importance of accounting for the effects of the technology on production risk, in addition to the mean effect estimated by previous studies. The risk effects of the technology are likely very important to smallholder farmers in the developing world due to their risk-aversion. We advance the emergent literature on Bt-cotton and production risk by using panel data methods to control for possible endogeneity of Bt-adoption. We estimate two models, the first a fixed-effects version of the Just and Pope model with additive individual and time effects, and the second a variation of the model in which inputs and variety choice are allowed to affect the variance of the time effect and its correlation with the idiosyncratic error. The models are applied to panel data on smallholder cotton production in India and South Africa. Our results suggest a risk-reducing effect of Bt-cotton in India, but an inconclusive picture in South Africa.
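The Just and Pope specification lets inputs shift both the mean and the variance of output: y = f(x; β) + h(x; α)^(1/2) ε. A cross-sectional two-step sketch on synthetic data (the paper's own estimator is a fixed-effects panel version, which this does not reproduce; all numbers are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(3)
# Synthetic production data: input x affects mean and variance of output y
n = 2000
x = rng.uniform(1, 5, n)
h = np.exp(0.5 + 0.4 * x)                     # true variance function h(x)
y = 2.0 + 1.5 * x + np.sqrt(h) * rng.standard_normal(n)

# Step 1: OLS for the mean function f(x)
X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Step 2: regress log squared residuals on x to recover the variance function
resid = y - X @ beta
alpha, *_ = np.linalg.lstsq(X, np.log(resid ** 2), rcond=None)
print(beta.round(2), alpha.round(2))
```

The intercept of the second regression is biased (E[log χ²(1)] ≈ -1.27), but the slope on x, which carries the risk-increasing or risk-reducing effect of an input, is consistently estimated.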

Relevance: 20.00%

Abstract:

Based on the potential benefits of cis-9, trans-11 conjugated linoleic acid (CLA) for human health, there is a need to develop effective strategies for enhancing milk fat CLA concentrations. Levels of cis-9, trans-11 CLA in milk can be increased by supplements of fish oil (FO) and sunflower oil (SO), but there is considerable variation in the response. Part of this variance may reflect time-dependent ruminal adaptations to high levels of lipid in the diet, which lead to alterations in the formation of specific biohydrogenation intermediates. To test this hypothesis, 16 late-lactation Holstein-British Friesian cows were used in a repeated measures randomized block design to examine milk fatty acid composition responses to FO and SO in the diet over a 28-d period. Cows were allocated at random to corn silage-based rations (8 per treatment) containing 0 (control) or 45 g of oil supplement/kg of dry matter, consisting (1:2; wt/wt) of FO and SO (FSO), and milk composition was determined on alternate days from d 1. Compared with the control, the FSO diet decreased mean dry matter intake (21.1 vs. 17.9 kg/d), milk fat content (47.7 vs. 32.6 g/kg), and protein content (36.1 vs. 33.3 g/kg), but had no effect on milk yield (27.1 vs. 26.4 kg/d). Reductions in milk fat content on the FSO diet were associated with increases in milk trans-10 18:1, trans-10, cis-12 CLA, and trans-9, cis-11 CLA concentrations (r(2) = 0.74, 0.57, and 0.80, respectively). Compared with the control, the FSO diet reduced milk 4:0 to 18:0 and cis 18:1 content and increased trans 18:1, trans 18:2, cis-9, trans-11 CLA, 20:5 n-3, and 22:6 n-3 concentrations. The FSO diet caused a rapid elevation in milk cis-9, trans-11 CLA content, reaching a maximum of 5.37 g/100 g of fatty acids on d 5, but these increases were transient, declining to 2.35 g/100 g of fatty acids by d 15. They remained relatively constant thereafter.
Even though concentrations of trans-11 18:1 followed the same pattern of temporal changes as cis-9, trans-11 CLA, the total trans 18:1 content of FSO milk was unchanged because of concomitant increases in the concentrations of other isomers (Delta(4-10) and Delta(12-15)), predominantly trans-10 18:1. In conclusion, supplementing diets with FSO enhances milk fat cis-9, trans-11 CLA content, but the high level of enrichment declines because of changes in ruminal biohydrogenation that result in trans-10 replacing trans-11 as the major 18:1 biohydrogenation intermediate formed in the rumen.

Relevance: 20.00%

Abstract:

Proportion estimators are quite frequently used in many application areas. The conventional proportion estimator (number of events divided by sample size) encounters a number of problems when the data are sparse, as will be demonstrated in various settings. The problem of estimating its variance when sample sizes become small is rarely addressed in a satisfying framework. Specifically, we have in mind applications like the weighted risk difference in multicenter trials or stratified risk ratio estimators (to adjust for potential confounders) in epidemiological studies. It is suggested to estimate p using the parametric family (see PDF for character) and p(1 - p) using (see PDF for character), where (see PDF for character). We investigate the estimation problem of choosing c >= 0 from various perspectives, including minimizing the average mean squared error of (see PDF for character), the average bias and the average mean squared error of (see PDF for character). The optimal value of c for minimizing the average mean squared error of (see PDF for character) is found to be independent of n and equals c = 1. The optimal value of c for minimizing the average mean squared error of (see PDF for character) is found to be dependent on n, with limiting value c = 0.833. This might justify using a near-optimal value of c = 1 in practice, which also turns out to be beneficial when constructing confidence intervals of the form (see PDF for character).
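The formulas in this copy of the abstract are unreadable ("see PDF for character"). One widely used single-parameter family of this general shape, shown here purely as an assumption about the intended form rather than a reconstruction, adds pseudo-counts c to both outcomes: p_hat_c = (x + c) / (n + 2c):

```python
# Hypothetical shrinkage family for a proportion; the abstract's own formulas
# are not recoverable from this copy, so this exact form is an assumption.
def p_hat(x, n, c=1.0):
    """Estimate p from x events in n trials with pseudo-count c."""
    return (x + c) / (n + 2 * c)

def var_hat(x, n, c=1.0):
    """Plug-in estimate of p(1 - p)."""
    p = p_hat(x, n, c)
    return p * (1 - p)

# With sparse data (x = 0 of n = 5) the conventional estimator collapses to 0
# and its estimated variance to 0, while the shrinkage estimator stays
# strictly inside (0, 1).
print(0 / 5, p_hat(0, 5))   # conventional: 0.0; shrinkage: ~0.143
```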

Relevance: 20.00%

Abstract:

The jackknife method is often used for variance estimation in sample surveys but has only been developed for a limited class of sampling designs. We propose a jackknife variance estimator which is defined for any without-replacement unequal probability sampling design. We demonstrate design consistency of this estimator for a broad class of point estimators. A Monte Carlo study shows how the proposed estimator may improve on existing estimators.
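As a baseline for the technique the paper generalizes, the delete-one jackknife recomputes the estimator with each unit removed and scales the spread of the replicates. A sketch for the sample mean under simple random sampling (hypothetical data; the paper's contribution is the extension to unequal-probability designs, not shown here):

```python
import numpy as np

y = np.array([3.0, 7.0, 2.0, 9.0, 5.0, 6.0, 4.0, 8.0])  # hypothetical sample
n = y.size

# Delete-one jackknife: recompute the estimator leaving out each unit in turn
theta_i = np.array([np.delete(y, i).mean() for i in range(n)])
v_jack = (n - 1) / n * np.sum((theta_i - theta_i.mean()) ** 2)

# For the sample mean this reproduces the usual estimator s^2 / n exactly
print(v_jack, y.var(ddof=1) / n)
```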

Relevance: 20.00%

Abstract:

It is common practice to design a survey with a large number of strata. However, in this case the usual techniques for variance estimation can be inaccurate. This paper proposes a variance estimator for estimators of totals. The method proposed can be implemented with standard statistical packages without any specific programming, as it involves simple techniques of estimation, such as regression fitting.

Relevance: 20.00%

Abstract:

The systematic sampling (SYS) design (Madow and Madow, 1944) is widely used by statistical offices due to its simplicity and efficiency (e.g., Iachan, 1982). But it suffers from a serious defect: it is impossible to estimate the sampling variance unbiasedly (Iachan, 1982), and the usual variance estimators (Yates and Grundy, 1953) are inadequate and can overestimate the variance significantly (Särndal et al., 1992). We propose a novel variance estimator that is less biased and can be implemented with any given population order. We justify this estimator theoretically and with a Monte Carlo simulation study.
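One standard workaround in this literature (named here for illustration; it is not necessarily the estimator this paper proposes) is the successive-difference estimator, which exploits the ordering of a systematic sample instead of applying the simple-random-sampling formula. A sketch with hypothetical data:

```python
import numpy as np

# Hypothetical systematic sample of n units, listed in population order
y = np.array([10.2, 11.0, 10.8, 12.1, 11.5, 12.4, 13.0, 12.7])
n = y.size
N = 800                                   # hypothetical population size
f = n / N                                 # sampling fraction

# Successive-difference variance estimator for the estimated mean:
# replaces s^2 with half the mean squared difference of neighbouring units
sd2 = np.sum(np.diff(y) ** 2) / (2 * (n - 1))
v_mean = (1 - f) * sd2 / n

print(v_mean)
```

When nearby units are similar, as is typical under systematic sampling of an ordered population, sd2 is smaller than the ordinary sample variance, which is why the SRS formula tends to overestimate the variance.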