971 results for Predictive Mean Squared Efficiency


Relevance: 100.00%

Abstract:

A lack of quantitative high-resolution paleoclimate data from the Southern Hemisphere limits the ability to examine current trends within the context of long-term natural climate variability. This study presents a temperature reconstruction for southern Tasmania based on analyses of a sediment core from Duckhole Lake (43.365°S, 146.875°E). The relationship between non-destructive whole-core scanning reflectance spectroscopy measurements in the visible spectrum (380–730 nm) and the instrumental temperature record (AD 1911–2000) was used to develop a calibration-in-time, reflectance spectroscopy-based temperature model. Results showed that a trough in reflectance from 650 to 700 nm, which represents chlorophyll and its derivatives, was significantly correlated with annual mean temperature. A calibration model was developed (R = 0.56, p < 0.05, root mean squared error of prediction (RMSEP) = 0.21°C, five-year filtered data, calibration period 1911–2000) and applied down-core to reconstruct annual mean temperatures in southern Tasmania over the last c. 950 years. This indicated that temperatures were initially cool c. AD 1050, but steadily increased until the late AD 1100s. After a brief cool period in the AD 1200s, temperatures again increased. Temperatures steadily decreased during the AD 1600s and remained relatively stable until the start of the 20th century, when they rapidly decreased before increasing from the AD 1960s onwards. Comparisons with high-resolution temperature records from western Tasmania, New Zealand and South America revealed some similarities, but also highlighted differences in temperature variability across the mid-latitudes of the Southern Hemisphere. These are likely due to a combination of factors, including spatial variability in climate between and within regions, and differences between records that document seasonal (i.e. warm season/late summer) versus annual temperature variability. This highlights the need for further records from the mid-latitudes of the Southern Hemisphere in order to constrain past natural spatial and seasonal/annual temperature variability in the region, and to accurately identify and attribute changes to natural variability and/or anthropogenic activities.
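A minimal sketch of the calibration-in-time idea described above, assuming a simple linear regression between a reflectance index and the instrumental temperature record; the variable names and synthetic data are illustrative only and do not reproduce the authors' model.

```python
# Sketch (not the authors' code): regress an instrumental temperature series on a
# sediment reflectance index over the calibration window, report calibration R and
# RMSEP, then apply the coefficients down-core. All data below are synthetic.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1911, 2001)
reflectance_index = rng.normal(0.0, 1.0, years.size)                       # stand-in for the 650-700 nm trough
temperature = 0.2 * reflectance_index + rng.normal(0.0, 0.2, years.size)   # synthetic "instrumental" record

# Ordinary least-squares calibration: T = a * index + b
a, b = np.polyfit(reflectance_index, temperature, deg=1)
predicted = a * reflectance_index + b

r = np.corrcoef(predicted, temperature)[0, 1]
rmsep = np.sqrt(np.mean((predicted - temperature) ** 2))   # in-sample here; the study cross-validates
print(f"R = {r:.2f}, RMSEP = {rmsep:.2f} degC")

# Down-core application: apply the same coefficients to older reflectance values
downcore_index = rng.normal(0.0, 1.0, 950)
reconstructed_T = a * downcore_index + b
```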

Relevance: 100.00%

Abstract:

High-resolution, well-calibrated records of lake sediments are critically important for quantitative climate reconstructions, but they remain a methodological and analytical challenge. While several comprehensive paleotemperature reconstructions have been developed across Europe, only a few quantitative high-resolution studies exist for precipitation. Here we present a calibration and verification study of lithoclastic sediment proxies from proglacial Lake Oeschinen (46°30′N, 7°44′E, 1,580 m a.s.l., north-west Swiss Alps) that are sensitive to rainfall for the period AD 1901–2008. We collected two sediment cores, one in 2007 and another in 2011. The sediments are characterized by two facies: (A) mm-laminated clastic varves and (B) turbidites. The annual character of the laminae couplets was confirmed by radiometric dating (²¹⁰Pb, ¹³⁷Cs) and independent flood-layer chronomarkers. Individual varves consist of a dark sand-size spring-summer layer enriched in siliciclastic minerals and a lighter clay-size calcite-rich winter layer. Three subtypes of varves are distinguished: Type I with a 1–1.5 mm fining upward sequence; Type II with a distinct fine-sand base up to 3 mm thick; and Type III containing multiple internal microlaminae caused by individual summer rainstorm deposits. Delta-fan surface samples and sediment trap data fingerprint different sediment source areas and transport processes from the watershed and confirm the instant response of sediment flux to rainfall and erosion. Based on a highly accurate, precise and reproducible chronology, we demonstrate that sediment accumulation (varve thickness) is a quantitative predictor for cumulative boreal alpine spring (May–June) and spring/summer (May–August) rainfall (r_MJ = 0.71, r_MJJA = 0.60, p < 0.01). Bootstrap-based verification of the calibration model reveals a root mean squared error of prediction (RMSEP_MJ = 32.7 mm, RMSEP_MJJA = 57.8 mm) which is on the order of 10–13% of mean MJ and MJJA cumulative precipitation, respectively. These results highlight the potential of the Lake Oeschinen sediments for high-resolution reconstructions of past rainfall conditions in the northern Swiss Alps, central and eastern France and south-west Germany.
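The bootstrap-based RMSEP verification mentioned above could look roughly like the sketch below: calibrate on bootstrap resamples of the calibration years and evaluate prediction error on the out-of-bag years. This is an assumption about the general technique, not the published workflow, and the varve/rainfall series are synthetic placeholders.

```python
# Illustrative bootstrap RMSEP check for a varve-thickness -> cumulative rainfall
# calibration (synthetic data standing in for the AD 1901-2008 series).
import numpy as np

rng = np.random.default_rng(1)
n = 108                                            # years 1901-2008
varve_thickness = rng.lognormal(0.0, 0.3, n)       # mm
rain_mj = 250 + 120 * (varve_thickness - varve_thickness.mean()) + rng.normal(0, 35, n)

def rmsep_bootstrap(x, y, n_boot=1000):
    """Calibrate on a bootstrap sample, predict the out-of-bag years, return mean RMSEP."""
    errors = []
    idx = np.arange(x.size)
    for _ in range(n_boot):
        boot = rng.choice(idx, size=idx.size, replace=True)
        oob = np.setdiff1d(idx, boot)
        if oob.size == 0:
            continue
        a, b = np.polyfit(x[boot], y[boot], deg=1)
        resid = y[oob] - (a * x[oob] + b)
        errors.append(np.sqrt(np.mean(resid ** 2)))
    return float(np.mean(errors))

print(f"bootstrap RMSEP (MJ rainfall): {rmsep_bootstrap(varve_thickness, rain_mj):.1f} mm")
```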

Relevance: 100.00%

Abstract:

The range of novel psychoactive substances (NPS), including phenethylamines, cathinones, piperazines, tryptamines, etc., is continuously growing. Therefore, fast and reliable screening methods for these compounds are essential. The use of dried blood spots (DBS) as a fast, straightforward approach helps to simplify and shorten sample preparation significantly. DBS were produced from 10 µl of whole blood and extracted offline with 500 µl methanol, followed by evaporation and reconstitution in mobile phase. Reversed-phase chromatographic separation and mass spectrometric detection (RP-LC-MS/MS) were achieved within a run time of 10 min. The screening method was validated by evaluating the following parameters: limit of detection (LOD), matrix effect, selectivity and specificity, extraction efficiency, and short-term and long-term stability. Furthermore, the method was applied to authentic samples and the results were compared with those obtained with a validated whole blood method used for routine analysis of NPS. The LOD was between 1 and 10 ng/ml. No interference from matrix compounds was observed. The method was proven to be specific and selective for the analytes, although with limitations for 3-FMC/flephedrone and MDDMA/MDEA. Mean extraction efficiency was 84.6%. All substances were stable in DBS for at least a week when cooled; cooling was essential for the stability of cathinones. Prepared samples were stable for at least 3 days. Comparison with the validated whole blood method yielded similar results. DBS were shown to be useful in developing a rapid screening method for NPS with simplified sample preparation. Copyright © 2013 John Wiley & Sons, Ltd.

Relevance: 100.00%

Abstract:

The use of group-randomized trials is particularly widespread in the evaluation of health care, educational, and screening strategies. Group-randomized trials represent a subset of a larger class of designs, often labeled nested, hierarchical, or multilevel, and are characterized by the randomization of intact social units or groups rather than individuals. The application of random effects models to group-randomized trials requires the specification of fixed and random components of the model. The underlying assumption is usually that these random components are normally distributed. This research is intended to determine whether the Type I error rate and power are affected when the assumption of normality for the random component representing the group effect is violated. In this study, simulated data are used to examine the Type I error rate, power, bias and mean squared error of the estimates of the fixed effect and the observed intraclass correlation coefficient (ICC) when the random component representing the group effect possesses distributions with non-normal characteristics, such as heavy tails or severe skewness. The simulated data are generated with various characteristics (e.g. number of schools per condition, number of students per school, and several within-school ICCs) observed in most small, school-based, group-randomized trials. The analysis is carried out using SAS PROC MIXED, Version 6.12, with random effects specified in a RANDOM statement and restricted maximum likelihood (REML) estimation. The results from the non-normally distributed data are compared to the results obtained from the analysis of data with similar design characteristics but normally distributed random effects. The results suggest that violation of the normality assumption for the group component by a skewed or heavy-tailed distribution does not appear to influence the estimation of the fixed effect, Type I error, or power. Negative biases were detected when estimating the sample ICC, and they dramatically increased in magnitude as the true ICC increased. These biases were not as pronounced when the true ICC was within the range observed in most group-randomized trials (i.e. 0.00 to 0.05). The normally distributed group effect also resulted in biased ICC estimates when the true ICC was greater than 0.05; however, this may be a result of higher correlation within the data.
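A hedged sketch of the simulation idea (the original work used SAS PROC MIXED, not Python): draw school-level random effects from a normal versus a skewed distribution and compare the bias of an ANOVA-type ICC estimate against the true ICC. The design settings and the exponential skewed distribution are illustrative assumptions.

```python
# Generate multilevel data with normal vs. skewed group effects and compare ICC bias.
import numpy as np

rng = np.random.default_rng(2)
n_groups, n_per_group, true_icc = 20, 50, 0.05
sigma_b2 = true_icc / (1 - true_icc)     # between-group variance if within-group variance = 1

def mean_icc_estimate(group_effect_sampler, n_rep=2000):
    estimates = []
    for _ in range(n_rep):
        b = group_effect_sampler(n_groups) * np.sqrt(sigma_b2)
        y = b[:, None] + rng.normal(0.0, 1.0, (n_groups, n_per_group))
        msb = n_per_group * y.mean(axis=1).var(ddof=1)          # between-group mean square
        msw = y.var(axis=1, ddof=1).mean()                      # within-group mean square
        estimates.append((msb - msw) / (msb + (n_per_group - 1) * msw))
    return np.mean(estimates)

normal = lambda k: rng.normal(0.0, 1.0, k)
skewed = lambda k: rng.exponential(1.0, k) - 1.0                # mean 0, variance 1, right-skewed

print(f"mean ICC estimate, normal group effect: {mean_icc_estimate(normal):.4f} (true {true_icc})")
print(f"mean ICC estimate, skewed group effect: {mean_icc_estimate(skewed):.4f} (true {true_icc})")
```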

Relevance: 100.00%

Abstract:

In this thesis, we develop an adaptive framework for Monte Carlo rendering, and more specifically for Monte Carlo Path Tracing (MCPT) and its derivatives. MCPT is attractive because it can handle a wide variety of light transport effects, such as depth of field, motion blur, indirect illumination, participating media, and others, in an elegant and unified framework. However, MCPT is a sampling-based approach, and is only guaranteed to converge in the limit, as the sampling rate grows to infinity. At finite sampling rates, MCPT renderings are often plagued by noise artifacts that can be visually distracting. The adaptive framework developed in this thesis leverages two core strategies to address noise artifacts in renderings: adaptive sampling and adaptive reconstruction. Adaptive sampling consists of increasing the sampling rate on a per-pixel basis, to ensure that each pixel value is below a predefined error threshold. Adaptive reconstruction leverages the available samples on a per-pixel basis, in an attempt to achieve an optimal trade-off between minimizing the residual noise artifacts and preserving the edges in the image. In our framework, we greedily minimize the relative Mean Squared Error (rMSE) of the rendering by iterating over sampling and reconstruction steps. Given an initial set of samples, the reconstruction step aims at producing the rendering with the lowest rMSE on a per-pixel basis, and the next sampling step then further reduces the rMSE by distributing additional samples according to the magnitude of the residual rMSE of the reconstruction. This iterative approach tightly couples the adaptive sampling and adaptive reconstruction strategies, by ensuring that we densely sample only those regions of the image where adaptive reconstruction cannot properly resolve the noise. In a first implementation of our framework, we demonstrate the usefulness of our greedy error minimization using a simple reconstruction scheme leveraging a filterbank of isotropic Gaussian filters. In a second implementation, we integrate a powerful edge-aware filter that can adapt to the anisotropy of the image. Finally, in a third implementation, we leverage auxiliary feature buffers that encode scene information (such as surface normals, position, or texture), to improve the robustness of the reconstruction in the presence of strong noise.
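A very rough sketch, under strong simplifying assumptions, of one iteration of the greedy loop described above: per pixel, pick the isotropic Gaussian filter from a small filterbank that minimizes an estimated relative MSE, then allocate extra samples in proportion to the residual rMSE. The error model, scene, and constants are all illustrative; a real renderer would supply actual samples and a proper per-pixel variance estimate.

```python
# One greedy sampling/reconstruction iteration on a synthetic "rendering".
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(3)
truth = np.linspace(0.0, 1.0, 64)[None, :] * np.ones((64, 1))   # smooth synthetic scene
spp = np.full(truth.shape, 8)                                   # samples per pixel so far
noisy = truth + rng.normal(0.0, 0.5, truth.shape) / np.sqrt(spp)
var_of_mean = (0.5 ** 2) / spp                                  # per-pixel variance of the mean

sigmas = [0.0, 1.0, 2.0, 4.0]                                   # isotropic Gaussian filterbank
best_rmse_est = np.full(truth.shape, np.inf)
reconstruction = noisy.copy()
for s in sigmas:
    if s > 0:
        filtered = gaussian_filter(noisy, s)
        residual_var = var_of_mean / (1.0 + 4.0 * s)            # crude variance-reduction proxy
    else:
        filtered = noisy
        residual_var = var_of_mean
    bias_proxy = (filtered - noisy) ** 2                        # crude bias proxy
    rmse_est = (residual_var + bias_proxy) / (filtered ** 2 + 1e-2)   # relative MSE estimate
    mask = rmse_est < best_rmse_est
    best_rmse_est = np.where(mask, rmse_est, best_rmse_est)
    reconstruction = np.where(mask, filtered, reconstruction)

# Next sampling pass: distribute a fixed budget according to residual rMSE
budget = 4 * truth.size
extra = np.floor(budget * best_rmse_est / best_rmse_est.sum()).astype(int)
print("max extra samples allocated to a single pixel:", extra.max())
```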

Relevance: 100.00%

Abstract:

Chrysophyte cysts are recognized as powerful proxies of cold-season temperatures. In this paper we use the relationship between chrysophyte assemblages and the number of days below 4 °C (DB4 °C) in the epilimnion of a lake in northern Poland to develop a transfer function and to reconstruct winter severity in Poland for the last millennium. DB4 °C is a climate variable related to the length of the winter. Multivariate ordination techniques were used to study the distribution of chrysophytes from sediment traps of 37 lowland lakes distributed along a variety of environmental and climatic gradients in northern Poland. Of all the environmental variables measured, stepwise variable selection and individual redundancy analyses (RDA) identified DB4 °C as the most important variable for chrysophytes, explaining a portion of variance independent of variables related to water chemistry (conductivity, chlorides, K, sulfates), which were also important. A quantitative transfer function was created to estimate DB4 °C from sedimentary assemblages using partial least squares regression (PLS). The two-component model (PLS-2) had a cross-validated coefficient of determination of R² = 0.58, with a root mean squared error of prediction (RMSEP, based on leave-one-out cross-validation) of 3.41 days. The resulting transfer function was applied to an annually-varved sediment core from Lake Żabińskie, providing a new sub-decadal quantitative reconstruction of DB4 °C with high chronological accuracy for the period AD 1000–2010. During Medieval Times (AD 1180–1440) winters were generally shorter (warmer), except for a decade with very long and severe winters around AD 1260–1270 (following the AD 1258 volcanic eruption). The 16th and 17th centuries and the beginning of the 19th century experienced very long, severe winters. Comparison with other European cold-season reconstructions and atmospheric indices for this region indicates that a large part of the winter variability (reconstructed DB4 °C) is due to the interplay between the oscillations of the zonal flow controlled by the North Atlantic Oscillation (NAO) and the influence of continental anticyclonic systems (Siberian High, East Atlantic/Western Russia pattern). Differences with other European records are attributed to geographic climatological differences between Poland and Western Europe (Low Countries, Alps). Striking correspondence between the combined volcanic and solar forcing and the DB4 °C reconstruction prior to the 20th century suggests that winter climate in Poland responds mostly to naturally forced variability (volcanic and solar), and that the influence of unforced variability is low.
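The calibration step above (two-component PLS with leave-one-out RMSEP) can be sketched as follows. This is an illustrative assumption about the general technique, not the authors' software; the assemblage matrix and DB4 °C values are random placeholders for the 37 calibration lakes.

```python
# Two-component PLS regression with leave-one-out RMSEP on placeholder data.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneOut

rng = np.random.default_rng(4)
n_lakes, n_taxa = 37, 25
X = rng.random((n_lakes, n_taxa))                     # placeholder chrysophyte abundances
db4c = 40 + 30 * X[:, 0] + rng.normal(0, 3, n_lakes)  # placeholder DB4C values (days)

errors = []
for train, test in LeaveOneOut().split(X):
    model = PLSRegression(n_components=2).fit(X[train], db4c[train])
    errors.append(db4c[test][0] - model.predict(X[test]).ravel()[0])

rmsep = np.sqrt(np.mean(np.square(errors)))
print(f"leave-one-out RMSEP: {rmsep:.2f} days")
```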

Relevance: 100.00%

Abstract:

A state-of-the-art inverse model, the CarbonTracker Data Assimilation Shell (CTDAS), was used to optimize estimates of methane (CH4) surface fluxes using atmospheric observations of CH4 as a constraint. The model consists of the latest version of the TM5 atmospheric chemistry-transport model and an ensemble Kalman filter based data assimilation system. The model was constrained by atmospheric methane surface concentrations obtained from the World Data Centre for Greenhouse Gases (WDCGG). Prior methane emissions were specified for five sources: biosphere, anthropogenic, fire, termites and ocean, of which the biosphere and anthropogenic emissions were optimized. Atmospheric CH4 mole fractions for 2007 from northern Finland calculated from prior and optimized emissions were compared with observations. It was found that the root mean squared errors of the posterior estimates were more than halved. Furthermore, inclusion of NOAA observations of CH4 from weekly discrete air samples collected at Pallas improved agreement between posterior CH4 mole fraction estimates and continuous observations, and reduced the optimized biosphere emissions and their uncertainties in northern Finland.
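For orientation, a minimal, generic ensemble Kalman filter update is sketched below: it adjusts an ensemble of flux scaling factors so that simulated CH4 mole fractions move toward the observations. This is the textbook EnKF step, not CTDAS code, and all sizes, operators, and error settings are invented for illustration.

```python
# Generic stochastic EnKF update for an ensemble of flux scaling factors.
import numpy as np

rng = np.random.default_rng(5)
n_members, n_state, n_obs = 50, 10, 5                      # illustrative sizes

X = 1.0 + 0.1 * rng.normal(size=(n_state, n_members))      # prior ensemble of flux scaling factors
H = 0.1 * rng.random((n_obs, n_state))                     # placeholder linearized transport operator
y = np.full(n_obs, 1.85) + rng.normal(0, 0.01, n_obs)      # "observed" CH4 mole fractions (ppm)
R = np.diag(np.full(n_obs, 0.01 ** 2))                     # observation error covariance

HX = H @ X                                                 # simulated observations per member
X_anom = X - X.mean(axis=1, keepdims=True)
HX_anom = HX - HX.mean(axis=1, keepdims=True)
P_xy = X_anom @ HX_anom.T / (n_members - 1)
P_yy = HX_anom @ HX_anom.T / (n_members - 1) + R
K = P_xy @ np.linalg.inv(P_yy)                             # Kalman gain

Y = y[:, None] + rng.multivariate_normal(np.zeros(n_obs), R, n_members).T  # perturbed observations
X_post = X + K @ (Y - HX)
print("prior mean scaling:", X.mean(), "posterior mean scaling:", X_post.mean())
```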

Relevance: 100.00%

Abstract:

Image-based modeling is a popular approach to performing patient-specific biomechanical simulations. Accurate modeling is critical for orthopedic applications, to evaluate implant design and surgical planning. It has been shown that bone strength can be estimated from bone mineral density (BMD) and trabecular bone architecture. However, these findings cannot be directly and fully transferred to patient-specific modeling, since only BMD can be derived from clinical CT. Therefore, the objective of this study was to propose a method to predict the trabecular bone structure using a µCT atlas and an image registration technique. The approach was evaluated on femurs and patellae under physiological loading. The displacement and ultimate force for femurs loaded in the stance position were predicted with errors of 2.5% and 3.7%, respectively, while predictions obtained with an isotropic material resulted in errors of 7.3% and 6.9%. Similar results were obtained for the patella, where the strain predicted using the registration approach resulted in an improved mean squared error compared to the isotropic model. We conclude that the registration of anisotropic information from a single template bone enables more accurate patient-specific simulations from clinical image datasets than an isotropic model.

Relevance: 100.00%

Abstract:

BACKGROUND: Several parameters of heart rate variability (HRV) have been shown to predict the risk of sudden cardiac death (SCD) in cardiac patients. There is consensus that risk prediction is improved when HRV is measured during specific provocations such as an orthostatic challenge. For the first time, we provide data on the reproducibility of such a test in patients with a history of acute coronary syndrome. METHODS: Sixty male patients (65 ± 8 years) with a history of acute coronary syndrome on stable medication were included. HRV was measured in supine (5 min) and standing (5 min) positions on two occasions separated by two weeks. For risk assessment, the relevant time-domain [standard deviation of all R-R intervals (SDNN) and root mean square of successive differences between adjacent R-R intervals (RMSSD)], frequency-domain [low-frequency power (LF), high-frequency power (HF) and the LF/HF power ratio] and short-term fractal scaling exponent (DF1) measures were computed. Absolute reproducibility was assessed with the standard errors of the mean (SEM) and 95% limits of random variation, and relative reproducibility with the intraclass correlation coefficient (ICC). RESULTS: We found comparable SEMs and ICCs in the supine position and after the orthostatic challenge test. All ICCs were good to excellent (ICCs between 0.636 and 0.869). CONCLUSIONS: The reproducibility of HRV parameters during orthostatic challenge is good and comparable with that in the supine position.
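As a simple illustration (not the study's software), the two time-domain HRV measures named above can be computed from an R-R interval series as follows; the synthetic series below stands in for a 5-minute recording.

```python
# Compute SDNN and RMSSD from a synthetic R-R interval series (ms).
import numpy as np

rng = np.random.default_rng(6)
rr = 900 + rng.normal(0, 40, 350)           # synthetic R-R intervals (ms), roughly 5 min of beats

sdnn = np.std(rr, ddof=1)                   # standard deviation of all R-R intervals
rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))  # root mean square of successive differences

print(f"SDNN  = {sdnn:.1f} ms")
print(f"RMSSD = {rmssd:.1f} ms")
```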

Relevance: 100.00%

Abstract:

One of the difficulties in the practical application of ridge regression is that, for a given data set, it is unknown whether a selected ridge estimator has smaller squared error than the least squares estimator. The concept of the improvement region is defined, and a technique is developed that obtains approximate confidence intervals for the value of the ridge parameter k which produces the maximum reduction in mean squared error. Two simulation experiments were conducted to investigate how accurate these approximate confidence intervals might be.
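A small simulation sketch of the issue described above (an illustration of the improvement-region concept, not the dissertation's technique): for a collinear design, compare the mean squared error of ridge coefficient estimates over a grid of k with the least squares estimate at k = 0. The design, true coefficients, and grid are arbitrary choices.

```python
# Compare MSE of ridge vs. OLS coefficient estimates over a grid of k.
import numpy as np

rng = np.random.default_rng(7)
n, p = 50, 4
beta = np.array([1.0, 0.5, -0.5, 0.25])
z = rng.normal(size=(n, 1))
X = 0.9 * z + 0.1 * rng.normal(size=(n, p))            # strongly collinear predictors

ks = np.linspace(0.0, 2.0, 21)
mse = np.zeros_like(ks)
n_rep = 2000
for _ in range(n_rep):
    y = X @ beta + rng.normal(0, 1.0, n)
    for i, k in enumerate(ks):
        b_ridge = np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)
        mse[i] += np.sum((b_ridge - beta) ** 2)
mse /= n_rep

print("MSE of OLS (k = 0):", round(mse[0], 3))
print("improvement region (values of k with MSE below OLS):", ks[mse < mse[0]])
```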

Relevance: 100.00%

Abstract:

Basic information on the relief of a watershed, obtained through analytical-descriptive methodologies, provides those who evaluate projects involving the use of natural resources, such as integrated watershed management, environmental impact studies, soil degradation, deforestation, and conservation of water resources, among others, with the physical parameters needed for their analysis. These processes have a strong spatial component, and Geographic Information Systems (GIS) are extremely useful for them, with Digital Elevation Models (DEM) and their derivatives being a key component of this database. Products derived from these models, such as slope, aspect, or curvature, will only be as accurate as the DEM used to derive them. It is therefore essential to maximize the model's ability to represent terrain variations; to do so, an appropriate grid resolution must be selected according to the data available for its generation. This study evaluates the altimetric quality of six DEMs generated from two different source-data capture systems and from different grid resolutions. To determine DEM accuracy, a set of control points regarded as "ground truth" is usually compared with the values generated by the model at the same geographic positions. The study area is located in the town of Arrecifes, Buenos Aires province (Argentina), and covers approximately 120 ha. The results obtained for the two algorithms and the three grid sizes analyzed were as follows: for the DEM from contour algorithm, an RMSE (Root Mean Squared Error) of ± 0.11 m (1 m grid), ± 0.11 m (5 m grid), and ± 0.15 m (10 m grid); for the DEM from vector/points algorithm, an RMSE of ± 0.09 m (1 m grid), ± 0.11 m (5 m grid), and ± 0.11 m (10 m grid). These results indicate that the DEM generated from surveyed terrain points as source data and with the smallest grid size is the only one that meets the values reported in the national and international literature, making it suitable for natural resource projects at the ecotope (plot) scale. The remaining DEMs show RMSE values that make them suitable for evaluating projects involving natural resources at the landscape-unit scale (a set of ecotopes).
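The accuracy check described above reduces to comparing DEM elevations with ground-truth control points and reporting the RMSE; a minimal sketch with synthetic placeholder arrays is shown below.

```python
# RMSE of DEM elevations against surveyed control points (synthetic data).
import numpy as np

rng = np.random.default_rng(8)
truth_z = 50 + 5 * rng.random(200)                 # surveyed control-point elevations (m)
dem_z = truth_z + rng.normal(0.0, 0.11, 200)       # DEM elevations sampled at the same positions

rmse = np.sqrt(np.mean((dem_z - truth_z) ** 2))
print(f"RMSE = {rmse:.2f} m")
```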

Relevance: 100.00%

Abstract:

Up to now, snow cover on Antarctic sea ice and its impact on radar backscatter, particularly after the onset of freeze/thaw processes, are not well understood. Here we present a combined analysis of in situ observations of snow properties from the landfast sea ice in Atka Bay, Antarctica, and high-resolution TerraSAR-X backscatter data, for the transition from austral spring (November 2012) to summer (January 2013). The physical changes in the seasonal snow cover during that time are reflected in the evolution of TerraSAR-X backscatter. We are able to explain 76–93% of the spatio-temporal variability of the TerraSAR-X backscatter signal with up to four snowpack parameters, with a root mean squared error of 0.87–1.62 dB, using a simple multiple linear model. Over the complete study period, and especially after the onset of early-melt processes and freeze/thaw cycles, the majority of the variability in the backscatter is influenced by changes in snow/ice interface temperature, snow depth and top-layer grain size. This suggests it may be possible to retrieve snow physical properties over Antarctic sea ice from X-band SAR backscatter.
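A hedged sketch of the statistical step named above: an ordinary least squares multiple linear regression of backscatter on a few snowpack parameters, reporting explained variance and RMSE. The parameter set, coefficients, and data are synthetic stand-ins, not the study's measurements.

```python
# Multiple linear regression of backscatter on snowpack parameters (synthetic data).
import numpy as np

rng = np.random.default_rng(9)
n = 120
snow_depth = rng.uniform(0.1, 0.9, n)              # m
interface_temp = rng.uniform(-15.0, 0.0, n)        # degC, snow/ice interface
grain_size = rng.uniform(0.5, 3.0, n)              # mm, top layer
backscatter = -18 + 4 * snow_depth + 0.3 * interface_temp + 1.5 * grain_size + rng.normal(0, 1.0, n)

X = np.column_stack([np.ones(n), snow_depth, interface_temp, grain_size])
coef, *_ = np.linalg.lstsq(X, backscatter, rcond=None)
fitted = X @ coef

rmse = np.sqrt(np.mean((backscatter - fitted) ** 2))
r2 = 1 - np.sum((backscatter - fitted) ** 2) / np.sum((backscatter - backscatter.mean()) ** 2)
print(f"explained variance R^2 = {r2:.2f}, RMSE = {rmse:.2f} dB")
```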

Relevance: 100.00%

Abstract:

The Schwalbenberg II loess-paleosol sequence (LPS) is a key site for Marine Isotope Stage 3 (MIS 3) in Western Europe owing to eight succeeding cambisols, which primarily constitute the Ahrgau Subformation. This LPS therefore qualifies as a test candidate for the potential of temporally high-resolution geochemical data obtained from X-ray fluorescence (XRF) scanning of discrete samples, a fast and non-destructive tool for determining element composition. The geochemical data are first contextualized with existing proxy data such as magnetic susceptibility (MS) and organic carbon (Corg) and then aggregated into element log ratios characteristic of weathering intensity [LOG (Ca/Sr), LOG (Rb/Sr), LOG (Ba/Sr), LOG (Rb/K)] and dust provenance [LOG (Ti/Zr), LOG (Ti/Al), LOG (Si/Al)]. In general, the interpretation of rock magnetic particles is challenging in western Europe, where not only magnetic enhancement but also depletion plays a role. Our data indicate leaching- and topsoil-erosion-induced MS depletion at the Schwalbenberg II LPS. Besides weathering, LOG (Ca/Sr) is susceptible to secondary calcification; thus LOG (Rb/Sr) and LOG (Ba/Sr) are also shown to be influenced by calcification dynamics. Consequently, LOG (Rb/K) appears to be the most suitable weathering index, identifying the Sinzig Soils S1 and S2 as the most pronounced paleosols at this site. Sinzig Soil S3 is enclosed by gelic gleysols and, in contrast to S1 and S2, is only initially weathered, pointing to colder climate conditions. The Remagen Soils are also characterized by subtle to moderate positive excursions in the weathering indices. Comparing the Schwalbenberg II LPS with the nearby Eifel Lake Sediment Archive (ELSA) and other more distant German, Austrian and Czech LPS, while discussing time and climate as limiting factors for pedogenesis, we suggest that the lithologically determined paleosols are in-situ soil formations. The provenance indices document a Zr enrichment at the transition from the Ahrgau to the Hesbaye Subformation. This is explained by a conceptual model incorporating multiple sediment recycling and sorting effects in eolian and fluvial domains.
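The element log ratios listed above are straightforward to compute once element intensities are available; the helper below is an illustrative assumption (placeholder XRF intensities, base-10 logs), not the authors' workflow.

```python
# Compute weathering and provenance log ratios from placeholder XRF element intensities.
import numpy as np

rng = np.random.default_rng(10)
depth_cm = np.arange(0, 500, 2)
xrf = {el: rng.uniform(1e2, 1e4, depth_cm.size)
       for el in ("Ca", "Sr", "Rb", "Ba", "K", "Ti", "Zr", "Al", "Si")}

def log_ratio(a, b):
    """Base-10 log ratio of two element intensity series."""
    return np.log10(xrf[a] / xrf[b])

weathering = {name: log_ratio(*pair) for name, pair in
              {"LOG(Ca/Sr)": ("Ca", "Sr"), "LOG(Rb/Sr)": ("Rb", "Sr"),
               "LOG(Ba/Sr)": ("Ba", "Sr"), "LOG(Rb/K)": ("Rb", "K")}.items()}
provenance = {name: log_ratio(*pair) for name, pair in
              {"LOG(Ti/Zr)": ("Ti", "Zr"), "LOG(Ti/Al)": ("Ti", "Al"),
               "LOG(Si/Al)": ("Si", "Al")}.items()}

print({k: round(float(v.mean()), 2) for k, v in weathering.items()})
print({k: round(float(v.mean()), 2) for k, v in provenance.items()})
```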