963 results for Age, lower confidence level


Relevance:

100.00%

Publisher:

Abstract:

Changes to stratospheric sudden warmings (SSWs) over the coming century, as predicted by the Geophysical Fluid Dynamics Laboratory (GFDL) chemistry climate model [Atmospheric Model With Transport and Chemistry (AMTRAC)], are investigated in detail. Two sets of integrations, each a three-member ensemble, are analyzed. The first set is driven with observed climate forcings between 1960 and 2004; the second is driven with climate forcings from a coupled model run, including trace gas concentrations representing a midrange estimate of future anthropogenic emissions between 1990 and 2099. A small positive trend in the frequency of SSWs is found. This trend, amounting to 1 event/decade over a century, is statistically significant at the 90% confidence level and is consistent over the two sets of model integrations. Comparison of the model SSW climatology between the late 20th and 21st centuries shows that the increase is largest toward the end of the winter season. In contrast, the dynamical properties are not significantly altered in the coming century, despite the increase in SSW frequency. Owing to the intrinsic complexity of our model, the direct cause of the predicted trend in SSW frequency remains an open question.
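
As a rough illustration of the kind of trend test quoted above (not the paper's actual method), the sketch below fits a straight line to hypothetical annual SSW counts and checks significance at the 90% confidence level; all data and parameter values are invented.

```python
# Minimal sketch: testing a linear trend in annual SSW counts at the 90%
# confidence level with ordinary least squares. Synthetic data only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
years = np.arange(1990, 2100)
# Hypothetical counts: ~0.6 events/winter plus a weak upward trend of
# 0.001 events/winter per year (~1 extra event per decade after a century).
counts = rng.poisson(0.6 + 0.001 * (years - years[0]))

res = stats.linregress(years, counts)
significant_90 = res.pvalue < 0.10  # two-sided test at the 90% level
print(f"trend = {10 * res.slope:.3f} events/winter per decade, "
      f"p = {res.pvalue:.3f}, significant at 90%: {significant_90}")
```

For a trend this weak, significance is marginal in a single realization; the paper draws on three-member ensembles, which is why pooling runs matters.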

Relevance:

100.00%

Publisher:

Abstract:

The skill of numerical Lagrangian drifter trajectories in three ocean models is assessed by comparing these numerically obtained paths to the trajectories of drifting buoys in the real ocean. The skill assessment is performed using the two-sample Kolmogorov–Smirnov statistical test. To demonstrate the assessment procedure, it is applied to three different models of the Agulhas region. The test can be performed either on the crossing positions of one-dimensional sections, in order to test model performance at specific locations, or on the total two-dimensional data set of trajectories. The test yields four quantities: a binary decision on model skill; a confidence level, which can be used as a measure of goodness of fit of the model; a test statistic, which can be used to determine the sensitivity of the confidence level; and cumulative distribution functions that aid in the qualitative analysis. The ordering of models by their confidence levels is the same as the ordering based on the qualitative analysis, which suggests that the method is suited for model validation. Only one of the three models, a 1/10° two-way nested regional ocean model, might have skill in the Agulhas region. The other two models, a 1/2° global model and a 1/8° assimilative model, might have skill only on some sections in the region.
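
A minimal sketch of the two-sample Kolmogorov–Smirnov comparison described above, applied to crossing positions of a one-dimensional section; the distributions and threshold are illustrative stand-ins, not the study's data.

```python
# Compare observed vs. simulated section-crossing positions with the
# two-sample Kolmogorov-Smirnov test (synthetic data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
obs_crossings = rng.normal(loc=-31.0, scale=1.5, size=200)    # drifter latitudes
model_crossings = rng.normal(loc=-31.4, scale=1.2, size=500)  # simulated latitudes

res = stats.ks_2samp(obs_crossings, model_crossings)
# The p-value plays the role of the confidence-level / goodness-of-fit
# measure in the abstract; the binary skill decision is a cutoff on it.
has_skill = res.pvalue > 0.05  # illustrative significance cutoff
print(f"D = {res.statistic:.3f}, p = {res.pvalue:.3f}, model skill: {has_skill}")
```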

Relevance:

100.00%

Publisher:

Abstract:

A multivariate fit to the variation in global mean surface air temperature anomaly over the past half century is presented. The fit procedure allows for the effect of response time on the waveform, amplitude and lag of each radiative forcing input, and each input is allowed to have its own time constant. It is shown that the contribution of solar variability to the temperature trend since 1987 is small and downward; the best estimate is −1.3%, and the 2σ confidence level sets the uncertainty range at −0.7% to −1.9%. The result is the same whether one quantifies the solar variation using galactic cosmic ray fluxes (for which the analysis can be extended back to 1953) or the most accurate total solar irradiance data composite. The rise in global mean surface air temperature is predominantly associated with a linear increase that represents the combined effects of changes in anthropogenic well-mixed greenhouse gases and aerosols, although, in recent decades, there is also a considerable contribution from the relative lack of major volcanic eruptions. The best estimate is that anthropogenic factors contribute 75% of the rise since 1987, with an uncertainty range (set by the 2σ confidence level using an AR(1) noise model) of 49–160%; thus, the uncertainty is large, but we can state that at least half of the temperature trend comes from the linear term and that this term could explain the entire rise. The results are consistent with the Intergovernmental Panel on Climate Change (IPCC) estimates of the changes in radiative forcing (given for 1961–1995) and are here combined with those estimates to find the response times, equilibrium climate sensitivities and pertinent heat capacities (i.e. the depth into the oceans to which a given radiative forcing variation penetrates) of the quasi-periodic (decadal-scale) input forcing variations. As shown by previous studies, the decadal-scale variations do not penetrate as deeply into the oceans as the longer-term drifts and have shorter response times. Hence, conclusions about the response to century-scale forcing changes (and hence the associated equilibrium climate sensitivity and the temperature-rise commitment) cannot be drawn from studies of the response to shorter-period forcing changes.
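
The 2σ ranges above come from an AR(1) noise model. A common device for folding AR(1) persistence into a trend uncertainty is to shrink the sample size using the lag-1 autocorrelation of the residuals; the sketch below shows that device on synthetic data, and is not necessarily the authors' exact procedure.

```python
# Trend with a 2-sigma uncertainty under AR(1) noise: inflate the naive OLS
# error via the effective sample size n_eff = n (1 - r) / (1 + r), where r is
# the lag-1 autocorrelation of the residuals (a standard approximation).
import numpy as np

def trend_with_ar1_ci(t, y):
    n = len(y)
    A = np.vstack([t, np.ones(n)]).T
    (slope, intercept), *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - (slope * t + intercept)
    r = np.corrcoef(resid[:-1], resid[1:])[0, 1]   # lag-1 autocorrelation
    n_eff = n * (1 - r) / (1 + r)                  # effective sample size
    sigma2 = np.sum(resid**2) / (n_eff - 2)        # AR(1)-adjusted variance
    se = np.sqrt(sigma2 / np.sum((t - t.mean())**2))
    return slope, 2 * se                           # 2-sigma half-width

t = np.arange(1953, 2008, dtype=float)
# Synthetic anomaly series: linear rise plus strongly persistent noise.
y = 0.01 * (t - t[0]) + np.cumsum(np.random.default_rng(2).normal(0, 0.05, t.size))
slope, half_width = trend_with_ar1_ci(t, y)
print(f"trend = {slope:.4f} ± {half_width:.4f} K/yr (2-sigma, AR(1)-adjusted)")
```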

Relevance:

100.00%

Publisher:

Abstract:

We have combined several key sample preparation steps for the use of a liquid matrix system to provide high analytical sensitivity in automated ultraviolet matrix-assisted laser desorption/ionisation mass spectrometry (UV-MALDI-MS). This new sample preparation protocol employs a matrix mixture based on the glycerol matrix mixture described by Sze et al. (J. Am. Soc. Mass Spectrom. 1998, 9, 166–174). The low-femtomole sensitivity achievable with this new preparation protocol enables proteomic analysis of protein digests comparable to solid-state matrix systems. For automated data acquisition and analysis, the MALDI performance of this liquid matrix surpasses the conventional solid-state MALDI matrices. Besides the inherent general advantages of liquid samples for automated sample preparation and data acquisition, the use of the presented liquid matrix significantly reduces the extent of unspecific ion signals in peptide mass fingerprints compared with typically used solid matrices, such as 2,5-dihydroxybenzoic acid (DHB) or α-cyano-4-hydroxycinnamic acid (CHCA). In particular, matrix and low-mass ion signals and ion signals resulting from cation adduct formation are dramatically reduced. Consequently, the confidence level of protein identification by peptide mass mapping of in-solution and in-gel digests is generally higher.

Relevance:

100.00%

Publisher:

Abstract:

Global climate change results from a small yet persistent imbalance between the amount of sunlight absorbed by Earth and the thermal radiation emitted back to space. An apparent inconsistency has been diagnosed between interannual variations in the net radiation imbalance inferred from satellite measurements and the upper-ocean heating rate from in situ measurements, and this inconsistency has been interpreted as 'missing energy' in the system. Here we present a revised analysis of net radiation at the top of the atmosphere from satellite data, and we estimate ocean heat content based on three independent sources. We find that the difference between the heat balance at the top of the atmosphere and the upper-ocean heat content change is not statistically significant when accounting for observational uncertainties in ocean measurements, given transitions in instrumentation and sampling. Furthermore, variability in Earth's energy imbalance related to the El Niño–Southern Oscillation is found to be consistent within observational uncertainties among the satellite measurements, a reanalysis model simulation and one of the ocean heat content records. We combine satellite data with ocean measurements to depths of 1,800 m, and show that between January 2001 and December 2010 Earth has been steadily accumulating energy at a rate of 0.50 ± 0.43 W m−2 (uncertainties at the 90% confidence level). We conclude that energy storage is continuing to increase in the sub-surface ocean.

Relevance:

100.00%

Publisher:

Abstract:

Geophysical time series sometimes exhibit serial correlations that are stronger than can be captured by the commonly used first‐order autoregressive model. In this study we demonstrate that a power law statistical model serves as a useful upper bound for the persistence of total ozone anomalies on monthly to interannual timescales. Such a model is usually characterized by the Hurst exponent. We show that the estimation of the Hurst exponent in time series of total ozone is sensitive to various choices made in the statistical analysis, especially whether and how the deterministic (including periodic) signals are filtered from the time series, and the frequency range over which the estimation is made. In particular, care must be taken to ensure that the estimate of the Hurst exponent accurately represents the low‐frequency limit of the spectrum, which is the part that is relevant to long‐term correlations and the uncertainty of estimated trends. Otherwise, spurious results can be obtained. Based on this analysis, and using an updated equivalent effective stratospheric chlorine (EESC) function, we predict that an increase in total ozone attributable to EESC should be detectable at the 95% confidence level by 2015 at the latest in southern midlatitudes, and by 2020–2025 at the latest over 30°–45°N, with the time to detection increasing rapidly with latitude north of this range.
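
A minimal sketch of a spectral Hurst-exponent estimate of the kind discussed above: fit the log-log periodogram slope β over a low-frequency band and use H = (1 + β)/2, the relation for stationary power-law noise. The series, detrending choice and fitting band are illustrative assumptions, not the paper's.

```python
# Estimate the Hurst exponent from the low-frequency end of the periodogram,
# after removing a deterministic (here linear) signal. For white noise the
# spectral slope beta is ~0, giving H ~ 0.5.
import numpy as np
from scipy import signal

rng = np.random.default_rng(3)
x = rng.standard_normal(4096)          # stand-in for deseasonalized anomalies
f, pxx = signal.periodogram(x, detrend="linear")
band = (f > 0) & (f < 0.05)            # low-frequency band (illustrative choice)
# S(f) ~ f^(-beta), so beta is minus the log-log slope of the periodogram.
beta = -np.polyfit(np.log(f[band]), np.log(pxx[band]), 1)[0]
hurst = (1 + beta) / 2
print(f"beta = {beta:.2f}, Hurst exponent H = {hurst:.2f}")
```

The abstract's warning applies directly here: move the band away from the low-frequency limit, or skip the detrending, and the estimate of H changes.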

Relevance:

100.00%

Publisher:

Abstract:

We have incorporated a semi-mechanistic isoprene emission module into the JULES land-surface scheme, as a first step towards a modelling tool that can be applied for studies of vegetation–atmospheric chemistry interactions, including chemistry-climate feedbacks. Here, we evaluate the coupled model against local above-canopy isoprene emission flux measurements from six flux tower sites as well as satellite-derived estimates of isoprene emission over tropical South America and east and south Asia. The model simulates diurnal variability well: correlation coefficients are significant (at the 95 % level) for all flux tower sites. The model reproduces day-to-day variability with significant correlations (at the 95 % confidence level) at four of the six flux tower sites. At the UMBS site, a complete set of seasonal observations is available for two years (2000 and 2002). The model reproduces the seasonal pattern of emission during 2002, but does less well in the year 2000. The model overestimates observed emissions at all sites, partially because it does not include isoprene loss through the canopy. Comparison with the satellite-derived isoprene-emission estimates suggests that the model simulates the main spatial patterns and the seasonal and inter-annual variability over tropical regions. The model yields a global annual isoprene emission of 535 ± 9 TgC yr−1 during the 1990s, 78 % of which comes from forested areas.

Relevance:

100.00%

Publisher:

Abstract:

1. The recent increase in planting of selected willow clones as energy crops for biomass production has resulted in a need to understand the relationship between commonly grown, clonally propagated genotypes and their pests. 2. For the first time, we present a study of the interactions of six willow clones and a previously unconsidered pest, the giant willow aphid Tuberolachnus salignus. 3. Tuberolachnus salignus alatae displayed no preference between the clones, but there was genetic variation in resistance between the clones; Q83 was the most resistant and led to the lowest reproductive performance in the aphid. 4. Maternal effects buffered changes in aphid performance: on four tested willow clones, the fecundity of first-generation aphids on the new host clone was intermediate between that of the second generation and that on the clone used to maintain the aphids in culture. 5. In the field, patterns of aphid infestation were highly variable between years, with the duration of attack being up to four times longer in 1999. In both years there was a significant effect of willow clone on the intensity of infestation. However, whereas Orm had the lowest intensity of infestation in the first year, Dasyclados supported a lower population level than the other monitored clones in the second year.

Relevance:

100.00%

Publisher:

Abstract:

Decades of research attest that memory processes suffer under conditions of auditory distraction. What is less well understood, however, is whether people are able to modify how their memory processes are deployed in order to compensate for the disruptive effects of distraction. The metacognitive approach to memory describes a variety of ways in which people can exert control over their cognitive processes to optimize performance. Here we describe our recent investigations into how these control processes change under conditions of auditory distraction. We looked specifically at control of encoding, in the form of decisions about how long to study a word when it is presented, and control of memory reporting, in the form of decisions about whether to volunteer or withhold retrieved details. Regarding control of encoding, we expected that people would compensate for the disruptive effects of distraction by extending study time under noise. Our results revealed, however, that when exposed to irrelevant speech, people curtail rather than extend study. Regarding control of memory reporting, we expected that people would compensate for the loss of access to memory records by volunteering responses held with lower confidence. Our results revealed, however, that people's reporting strategies do not differ when the memory task is performed in silence or under auditory distraction, although distraction seriously undermines people's confidence in their own responses. Together, our studies reveal novel avenues for investigating the psychological effects of auditory distraction within a metacognitive framework.

Relevance:

100.00%

Publisher:

Abstract:

This chapter presents findings on English Language instruction at the lower primary level in the context of policies for curricular innovation at the national, school and classroom levels. The focus is on policies which connect the national and school levels, and on how they might be interpreted when implemented in multiple schools within Singapore's educational system. Referring to case studies in two schools and to individual lesson observations in 10 schools, we found much agreement with national policies in terms of curriculum (i.e. lesson content and activity selection), leading to great uniformity in the lessons taught by different teachers in different schools. In addition, we found that schools had an important mediating influence on the implementation of national policies. However, adoptions and adaptations of policy innovations at the classroom level were somewhat superficial, as they related more to changes in educational facilities and procedures than in philosophies.

Relevance:

100.00%

Publisher:

Abstract:

The exhaust emission of polycyclic aromatic hydrocarbons (PAHs) considered toxic to human health was investigated in two spark-ignition light-duty vehicles, one fuelled with gasohol (in Brazil, the generic denomination for mixtures of gasoline with 20–25% anhydrous ethyl alcohol fuel, AEAF) and the other a flexible-fuel vehicle fuelled with hydrated ethanol. The influence of fuel type and quality, aged lubricant oil type and the use of fuel additives on the formation of these compounds was tested using standardized tests identical to the US FTP-75 cycle. PAH sampling and chemical analysis followed the basic recommendations of US EPA Compendium Method TO-13A (Determination of Polycyclic Aromatic Hydrocarbons (PAHs) in Ambient Air Using Gas Chromatography/Mass Spectrometry (GC/MS), 1999), with the necessary modifications for this particular application. Results showed that the total PAH emission factor varied from 41.9 μg km−1 to 612 μg km−1 for the gasohol vehicle, and from 11.7 μg km−1 to 27.4 μg km−1 for the ethanol-fuelled vehicle, a significant difference in favor of the ethanol vehicle. Generally, emission of light-molecular-weight PAHs was predominant, while high-molecular-weight PAHs were not detected. In terms of benzo(a)pyrene toxicity equivalence, emission factors varied from 0.00984 μg TEQ km−1 to 4.61 μg TEQ km−1 for the gasohol vehicle and from 0.0117 μg TEQ km−1 to 0.0218 μg TEQ km−1 for the ethanol vehicle. For the gasohol vehicle, results showed that the use of a fuel additive caused a significant increase in the emission of naphthalene and phenanthrene at a confidence level of 90% or higher; the use of rubber solvent in gasohol showed a reduction in the emission of naphthalene and phenanthrene at the same confidence level; and the use of synthetic oil instead of mineral oil also contributed significantly to a decrease in the emission of naphthalene and fluorene. For the ethanol vehicle, the same factors were tested and showed no statistically significant influence on PAH emission.
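
The benzo(a)pyrene toxicity equivalence quoted above is a TEF-weighted sum over compounds, TEQ = Σᵢ EFᵢ × TEFᵢ. The sketch below shows the arithmetic with placeholder numbers; the TEF values and emission factors are illustrative, not the paper's data.

```python
# Toxicity-equivalent (TEQ) emission factor relative to benzo(a)pyrene.
# TEFs and per-compound emission factors below are illustrative placeholders.
tef = {                     # toxic equivalency factors (BaP defined as 1.0)
    "naphthalene": 0.001,
    "phenanthrene": 0.001,
    "fluorene": 0.001,
    "benzo(a)pyrene": 1.0,
}
emission_ug_per_km = {      # hypothetical per-compound emission factors
    "naphthalene": 30.0,
    "phenanthrene": 8.0,
    "fluorene": 3.0,
    "benzo(a)pyrene": 0.004,
}
teq = sum(emission_ug_per_km[c] * tef[c] for c in tef)
print(f"TEQ emission factor = {teq:.4f} ug TEQ/km")
```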

Relevance:

100.00%

Publisher:

Abstract:

This work assesses the frequency of extreme values (EVs) of daily rainfall in the city of São Paulo, Brazil, over the period 1933–2005, based on the peaks-over-threshold (POT) and generalized Pareto distribution (GPD) approach. Usually, a GPD model is fitted to a sample of POT values selected with a constant threshold. However, in this work we use time-dependent thresholds, composed of relatively large p-quantiles (for example, p = 0.97) of the daily rainfall amounts computed from all available data. Samples of POT values were extracted for several values of p. Four different GPD models (GPD-1, GPD-2, GPD-3 and GPD-4) were fitted to each of these samples by the maximum likelihood (ML) method. The shape parameter was assumed constant for the four models, but time-varying covariates were incorporated into the scale parameter of GPD-2, GPD-3 and GPD-4, describing an annual cycle in GPD-2, a linear trend in GPD-3, and both an annual cycle and a linear trend in GPD-4. GPD-1, with constant scale and shape parameters, is the simplest model. To identify the best of the four models we used the rescaled Akaike Information Criterion (AIC) with second-order bias correction. This criterion isolates GPD-3 as the best model, i.e. the one with a positive linear trend in the scale parameter. The slope of this trend is significant, compared with the null hypothesis of no trend, at about the 98% confidence level. The non-parametric Mann–Kendall test also showed the presence of a positive trend in the annual frequency of excesses over high thresholds, with the p-value being virtually zero. Therefore, there is strong evidence that high quantiles of daily rainfall in the city of São Paulo have been increasing in magnitude and frequency over time. For example, the 0.99 quantile of daily rainfall amount increased by about 40 mm between 1933 and 2005.
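
For orientation, the sketch below fits a constant-threshold GPD to peaks over the p = 0.97 quantile of synthetic daily rainfall and computes an AIC, in the spirit of GPD-1; the paper's time-dependent thresholds and covariate-bearing scale parameter would require a custom likelihood.

```python
# Peaks-over-threshold / GPD fit with a constant threshold (synthetic data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
rain = rng.gamma(shape=0.4, scale=12.0, size=26000)  # stand-in daily rainfall, mm
u = np.quantile(rain, 0.97)                          # p = 0.97 threshold
excess = rain[rain > u] - u                          # POT excesses over u

# Fit the GPD to the excesses; loc is fixed at 0, as is standard for excesses.
shape, loc, scale = stats.genpareto.fit(excess, floc=0)
loglik = np.sum(stats.genpareto.logpdf(excess, shape, loc=0, scale=scale))
aic = 2 * 2 - 2 * loglik                             # 2 free parameters
print(f"threshold = {u:.1f} mm, shape = {shape:.3f}, "
      f"scale = {scale:.2f}, AIC = {aic:.1f}")
```

Comparing this AIC against fits whose scale parameter carries an annual cycle or a linear trend is what selects GPD-3 in the paper.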

Relevance:

100.00%

Publisher:

Abstract:

The transition redshift (deceleration/acceleration) is discussed by expanding the deceleration parameter to first order around its present value. A detailed study is carried out by considering two different parametrizations, q = q₀ + q₁z and q = q₀ + q₁z/(1 + z), and the associated free parameters (q₀, q₁) are constrained by three different supernova (SNe) samples. A previous analysis by Riess et al. using the first expansion is slightly improved and confirmed in light of their recent data (Gold07 sample). However, by fitting the model to the Supernova Legacy Survey (SNLS) type Ia sample, we find that the best fit to the transition redshift is zₜ = 0.61, instead of zₜ = 0.46 as derived by the High-z Supernova Search (HZSNS) team. This result based on the SNLS sample is also in good agreement with the sample of Davis et al., zₜ = 0.60 (+0.28, −0.11) (1σ). Such results are in line with some independent analyses and more easily accommodate the concordance flat model (ΛCDM). For both parametrizations, the three SNe Ia samples considered favour recent acceleration and past deceleration with a high degree of statistical confidence. All the kinematic results presented here depend neither on the validity of general relativity nor on the matter-energy content of the Universe.
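
Setting q(zₜ) = 0 makes the transition redshift explicit for both parametrizations, a one-line step the abstract leaves implicit:

```latex
% Transition redshift from q(z_t) = 0 for the two parametrizations:
\begin{align}
  q(z) = q_0 + q_1 z \quad &\Longrightarrow\quad z_t = -\frac{q_0}{q_1},\\
  q(z) = q_0 + q_1\,\frac{z}{1+z} \quad &\Longrightarrow\quad z_t = -\frac{q_0}{q_0+q_1}.
\end{align}
% With q_0 < 0 (acceleration today) and q_1 > 0 (deceleration in the past),
% both give z_t > 0; e.g. q_0 = -0.73, q_1 = 1.5 in the first parametrization
% yields z_t ~ 0.49 (illustrative values only, not the paper's best fit).
```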

Relevance:

100.00%

Publisher:

Abstract:

Data collected by the Pierre Auger Observatory provide evidence for anisotropy in the arrival directions of the highest-energy cosmic rays, which are correlated with the positions of relatively nearby active galactic nuclei (AGN) [Pierre Auger Collaboration, Science 318 (2007) 938]. The correlation has maximum significance for cosmic rays with energy greater than ~6 × 10¹⁹ eV and AGN at a distance less than ~75 Mpc. We have confirmed the anisotropy at a confidence level of more than 99% through a test with parameters specified a priori, using an independent data set. The observed correlation is compatible with the hypothesis that the highest-energy cosmic rays originate from extragalactic sources close enough that their flux is not significantly attenuated by interaction with the cosmic background radiation (the Greisen–Zatsepin–Kuzmin effect). The angular scale of the observed correlation is a few degrees, which suggests a predominantly light composition unless the magnetic fields are very weak outside the thin disk of our galaxy. Our present data do not unambiguously identify AGN as the sources of cosmic rays, and other candidate sources that are distributed like nearby AGN are not ruled out. We discuss the prospect of unequivocal identification of individual sources of the highest-energy cosmic rays within a few years of continued operation of the Pierre Auger Observatory.
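
An a priori correlation test of this kind is often evaluated as a binomial tail probability: given N events, each with isotropic chance probability p of correlating with a nearby AGN, the significance is P(k ≥ k_obs) under isotropy. The sketch below shows that calculation; the event counts and chance probability are placeholders, not the Auger values.

```python
# Binomial tail probability for an a-priori correlation test (placeholder
# numbers, not the published Auger data).
from scipy import stats

n_events = 13      # hypothetical size of the independent data set
n_correlated = 8   # hypothetical number correlating within the angular window
p_iso = 0.21       # hypothetical chance probability per event under isotropy

# P(k >= n_correlated) if arrival directions were isotropic:
p_value = stats.binom.sf(n_correlated - 1, n_events, p_iso)
print(f"chance probability = {p_value:.2e}, confidence = {1 - p_value:.4%}")
```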