733 results for Smoothed bootstrap


Relevance: 10.00%

Abstract:

The auditory brainstem response (ABR) is of fundamental importance to the investigation of auditory system behavior, though its interpretation has a subjective nature because of the manual process employed in its study and the clinical experience required for its analysis. When analyzing the ABR, clinicians are often interested in the identification of ABR signal components referred to as Jewett waves. In particular, the detection and study of the time at which these waves occur (i.e., the wave latency) is a practical tool for the diagnosis of disorders affecting the auditory system. In this context, the aim of this research is to compare ABR manual/visual analyses provided by different examiners. Methods: The ABR data were collected from 10 normal-hearing subjects (5 men and 5 women, aged 20 to 52 years). A total of 160 data samples were analyzed and a pairwise comparison between four distinct examiners was carried out. We performed a statistical study aiming to identify significant differences between the assessments provided by the examiners, using linear regression in conjunction with the bootstrap as a method for evaluating the relation between the responses given by the examiners. Results: The analysis suggests agreement among examiners but reveals differences between assessments of the variability of the waves. We quantified the magnitude of the obtained wave latency differences: 18% of the investigated waves presented substantial (large or moderate) differences, and of these 3.79% were considered not acceptable for clinical practice. Conclusions: Our results characterize the variability of the manual analysis of ABR data and the need to establish unified standards and protocols for the analysis of these data. These results may also contribute to the validation and development of automatic systems employed in the early diagnosis of hearing loss.
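
As an illustration of the bootstrap-plus-regression comparison described in this abstract, the following minimal Python sketch resamples paired latency readings from two examiners and builds a confidence interval for the regression slope. The arrays examiner_a and examiner_b are hypothetical stand-ins, not the study's data, and the procedure is only a plausible reading of the method, not the authors' code.

```python
# Minimal sketch (not the authors' code) of a paired bootstrap for the
# regression between two examiners' wave-latency readings.
import numpy as np

rng = np.random.default_rng(0)
examiner_a = rng.normal(5.6, 0.25, size=160)                 # hypothetical latencies (ms)
examiner_b = examiner_a + rng.normal(0.0, 0.1, size=160)     # hypothetical second examiner

def fit_slope_intercept(x, y):
    # Ordinary least-squares line y = a*x + b
    a, b = np.polyfit(x, y, deg=1)
    return a, b

n_boot = 2000
slopes = np.empty(n_boot)
for i in range(n_boot):
    idx = rng.integers(0, len(examiner_a), size=len(examiner_a))  # resample pairs
    slopes[i], _ = fit_slope_intercept(examiner_a[idx], examiner_b[idx])

lo, hi = np.percentile(slopes, [2.5, 97.5])
print(f"bootstrap 95% CI for the slope: [{lo:.3f}, {hi:.3f}]")
# A slope CI containing 1 (with an intercept near 0) is consistent with
# agreement between the two examiners' latency assessments.
```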

Relevance: 10.00%

Abstract:

This article examines the potential to improve numerical weather prediction (NWP) by estimating upper and lower bounds on predictability, revisiting the original study of Lorenz (1982) but applied to the most recent version of the European Centre for Medium-Range Weather Forecasts (ECMWF) forecast system, for both the deterministic and ensemble prediction systems (EPS). These bounds are contrasted with an older version of the same NWP system to see how they have changed as the NWP system has improved. The computations were performed for the earlier seasons of DJF 1985/1986 and JJA 1986 and the later seasons of DJF 2010/2011 and JJA 2011 using the 500-hPa geopotential height field. Results indicate that for this field we may be approaching the limit of deterministic forecasting, so that further improvements might only be obtained by improving the initial state. The results also show that predictability calculations with earlier versions of the model may overestimate potential forecast skill, which may be due to insufficient internal variability in the model and because recent versions of the model are more realistic in representing the true atmospheric evolution. The same methodology is applied to the EPS to calculate upper and lower bounds of predictability of the ensemble-mean forecast, in order to explore how ensemble forecasting could extend the limits of the deterministic forecast. The results show that there is a large potential to improve the ensemble predictions, but the increased predictability of the ensemble mean comes with a trade-off in information, as the forecasts become increasingly smoothed with time. From around the 10-d forecast time, the ensemble mean begins to converge towards climatology. Until this point, the ensemble mean is able to predict the main features of the large-scale flow accurately and with high consistency from one forecast cycle to the next. By the 15-d forecast time, the ensemble mean has lost information, with the anomaly of the flow strongly smoothed out. In contrast, the control forecast is much less consistent from run to run, but provides more detailed (unsmoothed), though less useful, information.

Relevance: 10.00%

Abstract:

Sea surface temperature (SST) can be estimated from day and night observations of the Spinning Enhanced Visible and Infra-Red Imager (SEVIRI) by optimal estimation (OE). We show that exploiting the 8.7 μm channel, in addition to the “traditional” wavelengths of 10.8 and 12.0 μm, improves OE SST retrieval statistics in validation. However, the main benefit is an improvement in the sensitivity of the SST estimate to variability in true SST. In a fair, single-pixel comparison, the 3-channel OE gives better results than the SST estimation technique presently operational within the Ocean and Sea Ice Satellite Application Facility. The operational technique uses SST retrieval coefficients, followed by a bias-correction step informed by radiative transfer simulation. However, the operational technique also includes an “atmospheric correction smoothing” step, which improves its noise performance and hitherto had no analogue within the OE framework. Here, we propose an analogue to atmospheric correction smoothing, based on the expectation that atmospheric total column water vapour has a longer spatial correlation length scale than SST features. The approach extends the observations input to the OE to include the averaged brightness temperatures (BTs) of nearby clear-sky pixels, in addition to the BTs of the pixel for which SST is being retrieved. The retrieved quantities are then the single-pixel SST and the clear-sky total column water vapour averaged over the vicinity of the pixel. This reduces the noise in the retrieved SST significantly. The robust standard deviation of the new OE SST compared to matched drifting buoys becomes 0.39 K for all data. The smoothed OE gives an SST sensitivity of 98% on average. This means that diurnal temperature variability and ocean frontal gradients are more faithfully estimated, and that the influence of the prior SST used is minimal (2%). This benefit is not available using traditional atmospheric correction smoothing.

Relevance: 10.00%

Abstract:

A steady decline in Arctic sea ice has been observed over recent decades. General circulation models predict further decreases under increasing greenhouse gas scenarios. Sea ice plays an important role in the climate system in that it influences ocean-to-atmosphere fluxes, surface albedo, and ocean buoyancy. The aim of this study is to isolate the climate impacts of a declining Arctic sea ice cover during the current century. The Hadley Centre Atmospheric Model (HadAM3) is forced with observed sea ice from 1980 to 2000 (obtained from satellite passive microwave radiometer data derived with the Bootstrap algorithm) and predicted sea ice reductions until 2100 under one moderate scenario and one severe scenario of ice decline, with a climatological SST field and increasing SSTs. Significant warming of the Arctic occurs during the twenty-first century (mean increase of between 1.6° and 3.9°C), with positive anomalies of up to 22°C locally. The majority of this is over ocean and limited to high latitudes, in contrast to recent observations of Northern Hemisphere warming. When a climatological SST field is used, statistically significant impacts on climate are only seen in winter, despite prescribing sea ice reductions in all months. When correspondingly increasing SSTs are incorporated, changes in climate are seen in both winter and summer, although the impacts in summer are much smaller. Alterations in atmospheric circulation and precipitation patterns are more widespread than temperature, extending down to midlatitude storm tracks. Results suggest that areas of Arctic land ice may even undergo net accumulation due to increased precipitation that results from loss of sea ice. Intensification of storm tracks implies that parts of Europe may experience higher precipitation rates.

Relevance: 10.00%

Abstract:

Several continuous observational datasets of Arctic sea-ice concentration are currently available that cover the period since the advent of routine satellite observations. We report on a comparison of three sea-ice concentration datasets: the National Ice Center charts, and two passive microwave radiometer datasets derived using different approaches, the NASA Team and Bootstrap algorithms. Empirical orthogonal function (EOF) analyses were employed to compare modes of variability and their consistency between the datasets. The analysis was motivated by the need for a reliable, realistic sea-ice climatology for use in climate model simulations, for which both the variability and the absolute values of extent and concentration are important. We found that, while there are significant discrepancies in absolute concentrations, the major modes of variability derived from all records were essentially the same.
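
A minimal sketch of the kind of EOF analysis described here: the leading modes of variability are obtained from a singular value decomposition of the time-by-space anomaly matrix. The synthetic concentration array below is a placeholder, not one of the three sea-ice datasets, and the dimensions are illustrative assumptions.

```python
# Minimal sketch (not the study's data or code) of an EOF analysis of a
# gridded sea-ice concentration record.
import numpy as np

rng = np.random.default_rng(1)
n_months, n_gridpoints = 240, 500                       # hypothetical record dimensions
concentration = rng.random((n_months, n_gridpoints))    # stand-in for a real dataset

anomaly = concentration - concentration.mean(axis=0)    # remove the time mean at each point
# Rows of vt are the spatial EOF patterns; u scaled by s gives the principal components.
u, s, vt = np.linalg.svd(anomaly, full_matrices=False)
explained = s**2 / np.sum(s**2)

print("variance explained by the first three EOFs:", explained[:3])
# Comparing the leading EOFs (and their PC time series) across the NIC,
# NASA Team and Bootstrap records is one way to check that the datasets
# share the same major modes of variability.
```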

Relevance: 10.00%

Abstract:

This paper presents single-column model (SCM) simulations of a tropical squall-line case observed during the Coupled Ocean-Atmosphere Response Experiment of the Tropical Ocean/Global Atmosphere Programme. This case-study was part of an international model intercomparison project organized by Working Group 4 ‘Precipitating Convective Cloud Systems’ of the GEWEX (Global Energy and Water-cycle Experiment) Cloud System Study. Eight SCM groups using different deep-convection parametrizations participated in this project. The SCMs were forced by temperature and moisture tendencies that had been computed from a reference cloud-resolving model (CRM) simulation using open boundary conditions. The comparison of the SCM results with the reference CRM simulation provided insight into the ability of current convection and cloud schemes to represent organized convection. The CRM results enabled a detailed evaluation of the SCMs in terms of the thermodynamic structure and the convective mass flux of the system, the latter being closely related to the surface convective precipitation. It is shown that the SCMs could reproduce reasonably well the time evolution of the surface convective and stratiform precipitation, the convective mass flux, and the thermodynamic structure of the squall-line system. The thermodynamic structure simulated by the SCMs depended on how the models partitioned the precipitation between convective and stratiform. However, structural differences persisted in the thermodynamic profiles simulated by the SCMs and the CRM. These differences could be attributed to the fact that the total mass flux used to compute the SCM forcing differed from the convective mass flux. The SCMs could not adequately represent the organized mesoscale circulations and the microphysical/radiative forcing associated with the stratiform region. This issue is generally known as the ‘scale-interaction’ problem that can only be properly addressed in fully three-dimensional simulations. Sensitivity simulations run by several groups showed that the time evolution of the surface convective precipitation was considerably smoothed when the convective closure was based on convective available potential energy instead of moisture convergence. Finally, additional SCM simulations without using a convection parametrization indicated that the impact of a convection parametrization in forced SCM runs was more visible in the moisture profiles than in the temperature profiles because convective transport was particularly important in the moisture budget.

Relevance: 10.00%

Abstract:

Methods of improving the coverage of Box–Jenkins prediction intervals for linear autoregressive models are explored. These methods use bootstrap techniques to allow for parameter estimation uncertainty and to reduce the small-sample bias in the estimator of the models' parameters. In addition, we consider a method of bias-correcting the non-linear functions of the parameter estimates that are used to generate conditional multi-step predictions.
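
The following Python sketch illustrates the general idea of a residual bootstrap for multi-step prediction intervals of an autoregressive model. It is not the paper's exact procedure; the AR(1) series, forecast horizon and interval level are illustrative assumptions.

```python
# Minimal sketch of a residual bootstrap for conditional multi-step
# prediction intervals of an AR(1) model y_t = c + phi*y_{t-1} + e_t.
import numpy as np

rng = np.random.default_rng(2)

def fit_ar1(y):
    # OLS estimates of (c, phi) and the in-sample residuals
    X = np.column_stack([np.ones(len(y) - 1), y[:-1]])
    c, phi = np.linalg.lstsq(X, y[1:], rcond=None)[0]
    resid = y[1:] - (c + phi * y[:-1])
    return c, phi, resid

def forecast(c, phi, last, shocks):
    # Conditional multi-step path starting from the last observation
    path, y = [], last
    for e in shocks:
        y = c + phi * y + e
        path.append(y)
    return np.array(path)

# Hypothetical series; in practice y would be the observed data.
y = np.empty(200); y[0] = 0.0
for t in range(1, 200):
    y[t] = 0.2 + 0.7 * y[t - 1] + rng.normal(0, 1)

c_hat, phi_hat, resid = fit_ar1(y)
horizon, n_boot = 5, 2000
paths = np.empty((n_boot, horizon))
for b in range(n_boot):
    # Re-estimate on a bootstrap replica of the series to capture parameter
    # uncertainty, then forecast with resampled residuals as future shocks.
    shocks_fit = rng.choice(resid, size=len(y) - 1, replace=True)
    y_star = forecast(c_hat, phi_hat, y[0], shocks_fit)
    c_b, phi_b, _ = fit_ar1(np.concatenate([[y[0]], y_star]))
    paths[b] = forecast(c_b, phi_b, y[-1],
                        rng.choice(resid, size=horizon, replace=True))

intervals = np.percentile(paths, [2.5, 97.5], axis=0)
print("95% prediction intervals per step:\n", intervals.T)
```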

Relevance: 10.00%

Abstract:

We present a benchmark system for global vegetation models. This system provides a quantitative evaluation of multiple simulated vegetation properties, including primary production; seasonal net ecosystem production; vegetation cover; composition and height; fire regime; and runoff. The benchmarks are derived from remotely sensed gridded datasets and site-based observations. The datasets allow comparisons of annual average conditions and seasonal and inter-annual variability, and they allow the impact of spatial and temporal biases in means and variability to be assessed separately. Specifically designed metrics quantify model performance for each process, and are compared to scores based on the temporal or spatial mean value of the observations and a "random" model produced by bootstrap resampling of the observations. The benchmark system is applied to three models: a simple light-use efficiency and water-balance model (the Simple Diagnostic Biosphere Model: SDBM), the Lund-Potsdam-Jena (LPJ) and Land Processes and eXchanges (LPX) dynamic global vegetation models (DGVMs). In general, the SDBM performs better than either of the DGVMs. It reproduces independent measurements of net primary production (NPP) but underestimates the amplitude of the observed CO2 seasonal cycle. The two DGVMs show little difference for most benchmarks (including the inter-annual variability in the growth rate and seasonal cycle of atmospheric CO2), but LPX represents burnt fraction demonstrably more accurately. Benchmarking also identified several weaknesses common to both DGVMs. The benchmarking system provides a quantitative approach for evaluating how adequately processes are represented in a model, identifying errors and biases, tracking improvements in performance through model development, and discriminating among models. Adoption of such a system would do much to improve confidence in terrestrial model predictions of climate change impacts and feedbacks.
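
The "random" model benchmark mentioned above can be illustrated with a short sketch: a model score is compared with the score obtained when the observations are bootstrap-resampled onto the sites. The normalised mean error used here, and the synthetic observations, are assumptions for illustration rather than the benchmark system's exact metrics or data.

```python
# Minimal sketch (not the benchmark system itself) of comparing a model
# score against a bootstrap-resampled "random model" baseline.
import numpy as np

rng = np.random.default_rng(3)

def nme(model, obs):
    # Normalised mean error: mean |model - obs| scaled by the mean
    # absolute deviation of the observations from their own mean.
    return np.mean(np.abs(model - obs)) / np.mean(np.abs(obs - obs.mean()))

# Hypothetical site-based NPP observations and corresponding simulated values.
obs = rng.gamma(2.0, 300.0, size=150)
sim = obs * rng.normal(1.0, 0.2, size=150)

model_score = nme(sim, obs)
random_scores = [nme(rng.choice(obs, size=obs.size, replace=True), obs)
                 for _ in range(1000)]

print(f"model NME: {model_score:.2f}")
print(f"bootstrap 'random model' NME: {np.mean(random_scores):.2f}")
# A model is only informative if its score beats both the mean-value
# benchmark (NME = 1 by construction) and the resampled-observation benchmark.
```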

Relevance: 10.00%

Abstract:

We compare and contrast the accuracy and uncertainty of forecasts of rents with those for a variety of macroeconomic series. The results show that, in general, forecasters tend to be marginally more accurate for macroeconomic series than for rents. Common across all of the series, forecasts tend to be smoothed, with forecasters under-estimating performance during economic booms and vice versa in recessions. We find that property forecasts are affected by economic uncertainty, as measured by disagreement across the macro-forecasters. Increased uncertainty leads to increased dispersion in the rental forecasts and a reduction in forecast accuracy.

Relevance: 10.00%

Abstract:

Reliable evidence of trends in the illegal ivory trade is important for informing decision making for elephants, but it is difficult to obtain due to the covert nature of the trade. The Elephant Trade Information System, a global database of reported seizures of illegal ivory, holds the only extensive information on the illicit trade available. However, inherent biases in seizure data make it difficult to infer trends; countries differ in their ability to make and report seizures, and these differences cannot be directly measured. We developed a new modelling framework to provide quantitative evidence on trends in the illegal ivory trade from seizure data. The framework used Bayesian hierarchical latent variable models to reduce bias in seizure data by identifying proxy variables that describe the variability in seizure and reporting rates between countries and over time. The models produced bias-adjusted smoothed estimates of relative trends in illegal ivory activity for raw and worked ivory in three weight classes. Activity is represented by two indicators describing the number of illegal ivory transactions (the Transactions Index) and the total weight of illegal ivory transactions (the Weights Index) at global, regional or national levels. Globally, activity was found to be rapidly increasing and at its highest level for 16 years, more than doubling from 2007 to 2011 and tripling from 1998 to 2011. Over 70% of the Transactions Index comes from shipments of worked ivory weighing less than 10 kg, and the rapid increase since 2007 is mainly due to increased consumption in China. Over 70% of the Weights Index comes from shipments of raw ivory weighing at least 100 kg, mainly moving from Central and East Africa to Southeast and East Asia. The results tie together recent findings on trends in poaching rates, declining populations and consumption, and provide detailed evidence to inform international decision making on elephants.

Relevance: 10.00%

Abstract:

Recent studies of the variation of geomagnetic activity over the past 140 years have quantified the "coronal source" or "open" magnetic flux F_s that leaves the solar atmosphere and enters the heliosphere, and have shown that it has risen, on average, by 34% since 1963 and by 140% since 1900. This variation is reflected in studies of the heliospheric field using isotopes deposited in ice sheets and meteorites by the action of galactic cosmic rays. The variation has also been reproduced using a model that demonstrates how the open flux accumulates and decays, depending on the rate of flux emergence in active regions and on the length of the solar cycle. The cosmic ray flux at energies > 3 GeV is found to have decayed by about 15% during the 20th century (and by about 4% at > 13 GeV). We show that the changes in the open flux do reflect changes in the photospheric and sub-surface field, which offers an explanation of why the open flux appears to be a good proxy for solar irradiance extrapolation. Correlations between F_s, solar cycle length, L, and the 11-year smoothed sunspot number, R_11, explain why the various irradiance reconstructions for the last 150 years are similar in form. Possible implications of the inferred changes in cosmic ray flux and irradiance for global temperatures on Earth are discussed.

Relevance: 10.00%

Abstract:

The Sun-Earth connection is studied using long-term measurements from the Sun and from the Earth. Auroral activity is shown to correlate to high accuracy with the smoothed sunspot numbers. Similarly, both geomagnetic activity and the global surface temperature anomaly can be linked to cyclic changes in solar activity. The interlinked variations in the solar magnetic activity and in the solar irradiance cause effects that can be observed both in the Earth's biosphere and in the electromagnetic environment. The long-term data sets suggest that the increases in geomagnetic activity and surface temperature are related (at least partially) to longer-term solar variations, which probably include an increasing trend superposed on a cyclic behavior with a period of about 90 years.

Relevance: 10.00%

Abstract:

We present an analysis of the accuracy of the method introduced by Lockwood et al. (1994) for the determination of the magnetopause reconnection rate from the dispersion of precipitating ions in the ionospheric cusp region. Tests are made by applying the method to synthesised data. The simulated cusp ion precipitation data are produced by an analytic model of the evolution of newly-opened field lines, along which magnetosheath ions are firstly injected across the magnetopause and then dispersed as they propagate into the ionosphere. The rate at which these newly opened field lines are generated by reconnection can be varied. The derived reconnection rate estimates are then compared with the input variation to the model and the accuracy of the method assessed. Results are presented for steady-state reconnection, for continuous reconnection showing a sine-wave variation in rate and for reconnection which only occurs in square wave pulses. It is found that the method always yields the total flux reconnected (per unit length of the open-closed field-line boundary) to within an accuracy of better than 5%, but that pulses tend to be smoothed so that the peak reconnection rate within the pulse is underestimated and the pulse length is overestimated. This smoothing is reduced if the separation between energy channels of the instrument is reduced; however this also acts to increase the experimental uncertainty in the estimates, an effect which can be countered by improving the time resolution of the observations. The limited time resolution of the data is shown to set a minimum reconnection rate below which the method gives spurious short-period oscillations about the true value. Various examples of reconnection rate variations derived from cusp observations are discussed in the light of this analysis.

Relevance: 10.00%

Abstract:

The concept of zero-flow equilibria of the magnetosphere-ionosphere system leads to a large number of predictions concerning the ionospheric signatures of pulsed magnetopause reconnection. These include: poleward-moving F-region electron temperature enhancements and associated transient 630nm emission; associated poleward plasma flow which, compared to the pulsed variation of the reconnection rate, is highly smoothed by induction effects; oscillatory latitudinal motion of the open/closed field line boundary; phase lag of plasma flow enhancements after equatorward motions of the boundary; azimuthal plasma flow bursts, coincident in time and space with the 630nm-dominant auroral transients, only when the magnitude of the By component of the interplanetary magnetic field (IMF) is large; azimuthal-then-poleward motion of 630nm-dominant transients at a velocity which at all times equals the internal plasma flow velocity; 557.7nm-dominant transients on one edge of the 630nm-dominant transient (initially, and for large |By|, on the poleward or equatorward edge depending on the polarity of IMF By); tailward expansion of the flow response at several km s-1; and discrete steps in the cusp ion dispersion signature between the poleward-moving structures. This paper discusses these predictions and how all have recently been confirmed by combinations of observations by optical instruments on the Svalbard Islands, the EISCAT radars and the DMSP and DE satellites.

Relevance: 10.00%

Abstract:

Background: The validity of ensemble averaging of event-related potential (ERP) data has been questioned, due to its assumption that the ERP is identical across trials. Thus, there is a need for preliminary testing for cluster structure in the data. New method: We propose a complete pipeline for the cluster analysis of ERP data. To increase the signal-to-noise ratio (SNR) of the raw single trials, we used a denoising method based on Empirical Mode Decomposition (EMD). Next, we used a bootstrap-based method to determine the number of clusters, through a measure called the Stability Index (SI). We then used a clustering algorithm based on a Genetic Algorithm (GA) to define initial cluster centroids for subsequent k-means clustering. Finally, we visualised the clustering results through a scheme based on Principal Component Analysis (PCA). Results: After validating the pipeline on simulated data, we tested it on data from two experiments – a P300 speller paradigm on a single subject and a language processing study on 25 subjects. Results revealed evidence for the existence of 6 clusters in one experimental condition from the language processing study. Further, a two-way chi-square test revealed an influence of subject on cluster membership.
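
A minimal sketch of the bootstrap-based selection of the number of clusters: repeated k-means fits on bootstrap resamples are compared with a reference partition, and the most stable k is retained. The agreement score used here (the adjusted Rand index) and the synthetic data are assumptions for illustration, not necessarily the paper's exact Stability Index or pipeline.

```python
# Minimal sketch of choosing k for k-means via bootstrap stability.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import adjusted_rand_score

rng = np.random.default_rng(4)
# Hypothetical feature matrix standing in for denoised single-trial ERP features.
X = np.vstack([rng.normal(m, 0.5, size=(60, 8)) for m in (-2.0, 0.0, 2.0)])

def stability(X, k, n_boot=20):
    # Reference partition on the full data.
    ref = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X)
    scores = []
    for b in range(n_boot):
        idx = rng.integers(0, len(X), size=len(X))          # bootstrap resample
        km = KMeans(n_clusters=k, n_init=10, random_state=b).fit(X[idx])
        # Agreement between the reference labels of the resampled trials and
        # the labels the bootstrap model assigns to those same trials.
        scores.append(adjusted_rand_score(ref.labels_[idx], km.labels_))
    return np.mean(scores)

for k in range(2, 7):
    print(k, round(stability(X, k), 3))
# The k with the highest (most stable) average agreement is taken as the
# number of clusters.
```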