884 results for Mean squared error


Relevance: 100.00%

Abstract:

We present a vertically resolved zonal mean monthly mean global ozone data set spanning the period 1901 to 2007, called HISTOZ.1.0. It is based on a new approach that combines information from an ensemble of chemistry climate model (CCM) simulations with historical total column ozone information. The CCM simulations incorporate important external drivers of stratospheric chemistry and dynamics (in particular solar and volcanic effects, greenhouse gases and ozone depleting substances, sea surface temperatures, and the quasi-biennial oscillation). The historical total column ozone observations include ground-based measurements from the 1920s onward and satellite observations from 1970 to 1976. An off-line data assimilation approach is used to combine model simulations, observations, and information on the observation error. The period starting in 1979 was used for validation with existing ozone data sets and therefore only ground-based measurements were assimilated. Results demonstrate considerable skill from the CCM simulations alone. Assimilating observations provides additional skill for total column ozone. With respect to the vertical ozone distribution, assimilating observations increases on average the correlation with a reference data set, but does not decrease the mean squared error. Analyses of HISTOZ.1.0 with respect to the effects of El Niño–Southern Oscillation (ENSO) and of the 11 yr solar cycle on stratospheric ozone from 1934 to 1979 qualitatively confirm previous studies that focussed on the post-1979 period. The ENSO signature exhibits a much clearer imprint of a change in strength of the Brewer–Dobson circulation compared to the post-1979 period. The imprint of the 11 yr solar cycle is slightly weaker in the earlier period. Furthermore, the total column ozone increase from the 1950s to around 1970 at northern mid-latitudes is briefly discussed. 
Indications of contributions from a tropospheric ozone increase, greenhouse gases, and changes in atmospheric circulation are found. Finally, the paper points to several possible future improvements of HISTOZ.1.0.


A lack of quantitative high resolution paleoclimate data from the Southern Hemisphere limits the ability to examine current trends within the context of long-term natural climate variability. This study presents a temperature reconstruction for southern Tasmania based on analyses of a sediment core from Duckhole Lake (43.365°S, 146.875°E). The relationship between non-destructive whole core scanning reflectance spectroscopy measurements in the visible spectrum (380–730 nm) and the instrumental temperature record (AD 1911–2000) was used to develop a calibration-in-time reflectance spectroscopy-based temperature model. Results showed that a trough in reflectance from 650 to 700 nm, which represents chlorophyll and its derivatives, was significantly correlated with annual mean temperature. A calibration model was developed (R = 0.56, p < 0.05, root mean squared error of prediction (RMSEP) = 0.21°C, five-year filtered data, calibration period AD 1911–2000) and applied down-core to reconstruct annual mean temperatures in southern Tasmania over the last c. 950 years. This indicated that temperatures were initially cool c. AD 1050, but steadily increased until the late AD 1100s. After a brief cool period in the AD 1200s, temperatures again increased. Temperatures steadily decreased during the AD 1600s and remained relatively stable until the start of the 20th century, when they rapidly decreased, before increasing from the AD 1960s onwards. Comparisons with high resolution temperature records from western Tasmania, New Zealand and South America revealed some similarities, but also highlighted differences in temperature variability across the mid-latitudes of the Southern Hemisphere. These are likely due to a combination of factors, including the spatial variability in climate between and within regions, and differences between records that document seasonal (i.e. warm season/late summer) versus annual temperature variability.
This highlights the need for further records from the mid-latitudes of the Southern Hemisphere in order to constrain past natural spatial and seasonal/annual temperature variability in the region, and to accurately identify and attribute changes to natural variability and/or anthropogenic activities.
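
An RMSEP of this kind is commonly estimated by leave-one-out cross-validation over the calibration period. The sketch below assumes a simple linear calibration between the reflectance signal and temperature; the abstract does not specify the authors' exact model, so this is illustrative only:

```python
import numpy as np

def loo_rmsep(x, y):
    """Leave-one-out RMSEP for a simple linear calibration y ~ a + b*x.

    Each observation is held out in turn, the line is refit on the
    rest, and the held-out prediction errors are pooled.
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    errors = []
    for i in range(len(x)):
        keep = np.arange(len(x)) != i
        b, a = np.polyfit(x[keep], y[keep], 1)   # slope, intercept
        errors.append(y[i] - (a + b * x[i]))     # held-out residual
    return float(np.sqrt(np.mean(np.square(errors))))
```

Applied to filtered proxy and temperature series, this yields a single error figure in the units of the response (here °C).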


High-resolution, well-calibrated records of lake sediments are critically important for quantitative climate reconstructions, but they remain a methodological and analytical challenge. While several comprehensive paleotemperature reconstructions have been developed across Europe, only a few quantitative high-resolution studies exist for precipitation. Here we present a calibration and verification study of lithoclastic sediment proxies from proglacial Lake Oeschinen (46°30′N, 7°44′E, 1,580 m a.s.l., north–west Swiss Alps) that are sensitive to rainfall for the period AD 1901–2008. We collected two sediment cores, one in 2007 and another in 2011. The sediments are characterized by two facies: (A) mm-laminated clastic varves and (B) turbidites. The annual character of the laminae couplets was confirmed by radiometric dating (210Pb, 137Cs) and independent flood-layer chronomarkers. Individual varves consist of a dark sand-size spring-summer layer enriched in siliciclastic minerals and a lighter clay-size calcite-rich winter layer. Three subtypes of varves are distinguished: Type I with a 1–1.5 mm fining upward sequence; Type II with a distinct fine-sand base up to 3 mm thick; and Type III containing multiple internal microlaminae caused by individual summer rainstorm deposits. Delta-fan surface samples and sediment trap data fingerprint different sediment source areas and transport processes from the watershed and confirm the instant response of sediment flux to rainfall and erosion. Based on a highly accurate, precise and reproducible chronology, we demonstrate that sediment accumulation (varve thickness) is a quantitative predictor for cumulative boreal alpine spring (May–June) and spring/summer (May–August) rainfall (r_MJ = 0.71, r_MJJA = 0.60, p < 0.01).
Bootstrap-based verification of the calibration model reveals root mean squared errors of prediction (RMSEP_MJ = 32.7 mm, RMSEP_MJJA = 57.8 mm), which are on the order of 10–13% of mean MJ and MJJA cumulative precipitation, respectively. These results highlight the potential of the Lake Oeschinen sediments for high-resolution reconstructions of past rainfall conditions in the northern Swiss Alps, central and eastern France and south-west Germany.
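
A bootstrap-based verification of this kind can be sketched with a generic out-of-bag scheme; the simple linear varve-thickness-to-rainfall calibration assumed here is an illustration, not the authors' exact procedure:

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_rmsep(x, y, n_boot=500):
    """Bootstrap verification of a linear calibration.

    Each iteration refits the model on a bootstrap resample and
    predicts the out-of-bag observations; all out-of-bag errors are
    pooled into a single RMSEP.
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    n, errs = len(x), []
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)               # bootstrap resample
        oob = np.setdiff1d(np.arange(n), idx)     # out-of-bag cases
        if oob.size == 0:
            continue
        b, a = np.polyfit(x[idx], y[idx], 1)      # slope, intercept
        errs.extend(y[oob] - (a + b * x[oob]))
    return float(np.sqrt(np.mean(np.square(errs))))
```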


The use of group-randomized trials is particularly widespread in the evaluation of health care, educational, and screening strategies. Group-randomized trials represent a subset of a larger class of designs often labeled nested, hierarchical, or multilevel, and are characterized by the randomization of intact social units or groups, rather than individuals. The application of random effects models to group-randomized trials requires the specification of fixed and random components of the model. The underlying assumption is usually that these random components are normally distributed. This research is intended to determine whether the Type I error rate and power are affected when the assumption of normality for the random component representing the group effect is violated.

In this study, simulated data are used to examine the Type I error rate, power, bias and mean squared error of the estimates of the fixed effect and the observed intraclass correlation coefficient (ICC) when the random component representing the group effect possesses distributions with non-normal characteristics, such as heavy tails or severe skewness. The simulated data are generated with various characteristics (e.g. number of schools per condition, number of students per school, and several within-school ICCs) observed in most small, school-based, group-randomized trials. The analysis is carried out using SAS PROC MIXED, Version 6.12, with random effects specified in a random statement and restricted maximum likelihood (REML) estimation specified. The results from the non-normally distributed data are compared to the results obtained from the analysis of data with similar design characteristics but normally distributed random effects.

The results suggest that the violation of the normality assumption for the group component by a skewed or heavy-tailed distribution does not appear to influence the estimation of the fixed effect, Type I error, and power.
Negative biases were detected when estimating the sample ICC, and these dramatically increased in magnitude as the true ICC increased. These biases were not as pronounced when the true ICC was within the range observed in most group-randomized trials (i.e. 0.00 to 0.05). The normally distributed group effect also resulted in biased ICC estimates when the true ICC was greater than 0.05. However, this may be a result of higher correlation within the data.
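
The simulation design described above can be sketched in miniature. The snippet below is a hedged illustration rather than a reproduction of the SAS PROC MIXED/REML analysis: it generates balanced group-randomized data with a normal or skewed (chi-square) group effect and averages the one-way ANOVA estimate of the ICC over many replicates.

```python
import numpy as np

rng = np.random.default_rng(42)

def anova_icc(data):
    """One-way ANOVA estimator of the intraclass correlation for a
    balanced design given a (groups x members) array."""
    g, m = data.shape
    grand = data.mean()
    msb = m * np.sum((data.mean(axis=1) - grand) ** 2) / (g - 1)
    msw = np.sum((data - data.mean(axis=1, keepdims=True)) ** 2) / (g * (m - 1))
    return (msb - msw) / (msb + (m - 1) * msw)

def simulate_icc_bias(true_icc=0.05, groups=20, members=30, reps=200,
                      skewed=True):
    """Mean ICC estimate when the group effect is skewed (standardized
    chi-square with 1 df) versus normal; total variance is 1 so that
    the group variance equals the true ICC."""
    sg = np.sqrt(true_icc)        # between-group s.d.
    se = np.sqrt(1.0 - true_icc)  # within-group (residual) s.d.
    est = []
    for _ in range(reps):
        if skewed:                # mean 0, variance sg**2, heavy skew
            u = (rng.chisquare(1, groups) - 1.0) / np.sqrt(2.0) * sg
        else:
            u = rng.normal(0.0, sg, groups)
        data = u[:, None] + rng.normal(0.0, se, (groups, members))
        est.append(anova_icc(data))
    return float(np.mean(est))
```

Comparing the returned means for `skewed=True` and `skewed=False` at several true ICC values reproduces the kind of bias comparison described in the abstract.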


In this thesis, we develop an adaptive framework for Monte Carlo rendering, and more specifically for Monte Carlo Path Tracing (MCPT) and its derivatives. MCPT is attractive because it can handle a wide variety of light transport effects, such as depth of field, motion blur, indirect illumination, participating media, and others, in an elegant and unified framework. However, MCPT is a sampling-based approach, and is only guaranteed to converge in the limit, as the sampling rate grows to infinity. At finite sampling rates, MCPT renderings are often plagued by noise artifacts that can be visually distracting. The adaptive framework developed in this thesis leverages two core strategies to address noise artifacts in renderings: adaptive sampling and adaptive reconstruction. Adaptive sampling consists of increasing the sampling rate on a per-pixel basis, to ensure that each pixel value is below a predefined error threshold. Adaptive reconstruction leverages the available samples on a per-pixel basis, in an attempt to achieve an optimal trade-off between minimizing the residual noise artifacts and preserving the edges in the image. In our framework, we greedily minimize the relative Mean Squared Error (rMSE) of the rendering by iterating over sampling and reconstruction steps. Given an initial set of samples, the reconstruction step aims at producing the rendering with the lowest rMSE on a per-pixel basis, and the next sampling step then further reduces the rMSE by distributing additional samples according to the magnitude of the residual rMSE of the reconstruction. This iterative approach tightly couples the adaptive sampling and adaptive reconstruction strategies, ensuring that we sample densely only those regions of the image where adaptive reconstruction cannot properly resolve the noise.
In a first implementation of our framework, we demonstrate the usefulness of our greedy error minimization using a simple reconstruction scheme leveraging a filterbank of isotropic Gaussian filters. In a second implementation, we integrate a powerful edge-aware filter that can adapt to the anisotropy of the image. Finally, in a third implementation, we leverage auxiliary feature buffers that encode scene information (such as surface normals, position, or texture) to improve the robustness of the reconstruction in the presence of strong noise.
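
The greedy sampling loop can be illustrated with a toy one-dimensional "renderer". Everything here is a simplification of the thesis framework: `scene(i, n)` is a hypothetical callback returning n noisy samples for pixel i, reconstruction is reduced to plain per-pixel averaging, and a simple relative-error heuristic stands in for the per-pixel rMSE estimate:

```python
import numpy as np

def adaptive_render(scene, n_pixels=64, init=16, budget=1024, rounds=4):
    """Toy adaptive-sampling loop.

    Starting from `init` samples per pixel, each round distributes a
    share of the extra-sample budget to pixels in proportion to their
    estimated relative error, so noisy pixels get refined first.
    """
    sums = np.zeros(n_pixels)            # running sum of samples
    sqs = np.zeros(n_pixels)             # running sum of squares
    cnt = np.full(n_pixels, init)
    for i in range(n_pixels):
        s = scene(i, init)
        sums[i], sqs[i] = s.sum(), (s * s).sum()
    for _ in range(rounds):
        mean = sums / cnt
        var = np.maximum(sqs / cnt - mean ** 2, 0.0)
        rel = var / (cnt * (mean ** 2 + 1e-4))      # relative MSE proxy
        extra = np.floor(budget / rounds * rel / rel.sum()).astype(int)
        for i in np.nonzero(extra)[0]:              # refine noisy pixels
            s = scene(i, extra[i])
            sums[i] += s.sum()
            sqs[i] += (s * s).sum()
            cnt[i] += extra[i]
    return sums / cnt, cnt
```

Running this on a scene whose pixels have unequal noise levels shows the characteristic behavior: the sample-count map concentrates on the noisy regions while the estimates converge everywhere.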


Chrysophyte cysts are recognized as powerful proxies of cold-season temperatures. In this paper we use the relationship between chrysophyte assemblages and the number of days below 4 °C (DB4 °C) in the epilimnion of a lake in northern Poland to develop a transfer function and to reconstruct winter severity in Poland for the last millennium. DB4 °C is a climate variable related to the length of the winter. Multivariate ordination techniques were used to study the distribution of chrysophytes from sediment traps of 37 low-land lakes distributed along a variety of environmental and climatic gradients in northern Poland. Of all the environmental variables measured, stepwise variable selection and individual redundancy analyses (RDA) identified DB4 °C as the most important variable for chrysophytes, explaining a portion of variance independent of variables related to water chemistry (conductivity, chlorides, K, sulfates), which were also important. A quantitative transfer function was created to estimate DB4 °C from sedimentary assemblages using partial least squares regression (PLS). The two-component model (PLS-2) had a cross-validated coefficient of determination of R² = 0.58, with a root mean squared error of prediction (RMSEP, based on leave-one-out) of 3.41 days. The resulting transfer function was applied to an annually-varved sediment core from Lake Żabińskie, providing a new sub-decadal quantitative reconstruction of DB4 °C with high chronological accuracy for the period AD 1000–2010. During Medieval Times (AD 1180–1440) winters were generally shorter (warmer), except for a decade with very long and severe winters around AD 1260–1270 (following the AD 1258 volcanic eruption). The 16th and 17th centuries and the beginning of the 19th century experienced very long and severe winters.
Comparison with other European cold-season reconstructions and atmospheric indices for this region indicates that a large part of the winter variability (reconstructed DB4 °C) is due to the interplay between the oscillations of the zonal flow controlled by the North Atlantic Oscillation (NAO) and the influence of continental anticyclonic systems (Siberian High, East Atlantic/Western Russia pattern). Differences with other European records are attributed to geographic climatological differences between Poland and Western Europe (Low Countries, Alps). The striking correspondence between the combined volcanic and solar forcing and the DB4 °C reconstruction prior to the 20th century suggests that winter climate in Poland responds mostly to naturally forced variability (volcanic and solar) and that the influence of unforced variability is low.


Image-based modeling is a popular approach to perform patient-specific biomechanical simulations. Accurate modeling is critical for orthopedic applications such as evaluating implant designs and surgical planning. It has been shown that bone strength can be estimated from bone mineral density (BMD) and trabecular bone architecture. However, these findings cannot be directly and fully transferred to patient-specific modeling, since only BMD can be derived from clinical CT. Therefore, the objective of this study was to propose a method to predict the trabecular bone structure using a µCT atlas and an image registration technique. The approach was evaluated on femurs and patellae under physiological loading. The displacement and ultimate force for femurs loaded in stance position were predicted with errors of 2.5% and 3.7%, respectively, while predictions obtained with an isotropic material resulted in errors of 7.3% and 6.9%. Similar results were obtained for the patella, where the strain predicted using the registration approach resulted in an improved mean squared error compared to the isotropic model. We conclude that registering anisotropic information from a single template bone enables more accurate patient-specific simulations from clinical image datasets than an isotropic model.


Surface sediments from 68 small lakes in the Alps and 9 well-dated sediment core samples that cover a gradient of total phosphorus (TP) concentrations of 6 to 520 μg TP l⁻¹ were studied for diatom, chrysophyte cyst, cladocera, and chironomid assemblages. Inference models for mean circulation log10 TP were developed for diatoms, chironomids, and benthic cladocera using weighted-averaging partial least squares. After screening for outliers, the final transfer functions have coefficients of determination (r²), as assessed by cross-validation, of 0.79 (diatoms), 0.68 (chironomids), and 0.49 (benthic cladocera). Planktonic cladocera and chrysophytes show very weak relationships to TP, and no TP inference models were developed for these biota. Diatoms showed the best relationship with TP, whereas the other biota all have large secondary gradients, suggesting that variables other than TP have a strong influence on their composition and abundance. Comparison with other diatom–TP inference models shows that our model has high predictive power and a low root mean squared error of prediction, as assessed by cross-validation.


Strategies are compared for the development of a linear regression model with stochastic (multivariate normal) regressor variables and the subsequent assessment of its predictive ability. Bias and mean squared error of four estimators of predictive performance are evaluated in simulated samples of 32 population correlation matrices. Models including all of the available predictors are compared with those obtained using selected subsets. The subset selection procedures investigated include two stopping rules, C_p and S_p, each combined with an 'all possible subsets' or 'forward selection' of variables. The estimators of performance utilized include parametric (MSEP_m) and non-parametric (PRESS) assessments in the entire sample, and two data splitting estimates restricted to a random or balanced (Snee's DUPLEX) 'validation' half sample. The simulations were performed as a designed experiment, with population correlation matrices representing a broad range of data structures.

The techniques examined for subset selection do not generally result in improved predictions relative to the full model. Approaches using 'forward selection' result in slightly smaller prediction errors and less biased estimators of predictive accuracy than 'all possible subsets' approaches, but no differences are detected between the performances of C_p and S_p. In every case, prediction errors of models obtained by subset selection in either of the half splits exceed those obtained using all predictors and the entire sample.

Only the random split estimator is conditionally (on β) unbiased; however, MSEP_m is unbiased on average and PRESS is nearly so in unselected (fixed form) models. When subset selection techniques are used, MSEP_m and PRESS always underestimate prediction errors, by as much as 27 percent (on average) in small samples.
Despite their bias, the mean squared errors (MSE) of these estimators are at least 30 percent less than that of the unbiased random split estimator. The DUPLEX split estimator suffers from large MSE as well as bias, and seems of little value within the context of stochastic regressor variables.

To maximize predictive accuracy while retaining a reliable estimate of that accuracy, it is recommended that the entire sample be used for model development, and a leave-one-out statistic (e.g. PRESS) be used for assessment.
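
The leave-one-out statistic recommended here (PRESS) need not be computed by refitting n models: for ordinary least squares it follows directly from the ordinary residuals and the hat-matrix leverages. A minimal sketch:

```python
import numpy as np

def press(X, y):
    """PRESS (prediction sum of squares) for an OLS fit with intercept.

    Uses the hat-matrix identity: the leave-one-out (deleted) residual
    equals e_i / (1 - h_ii), so no model has to be refit n times.
    """
    X = np.asarray(X, float)
    y = np.asarray(y, float)
    Xd = np.column_stack([np.ones(len(X)), X])       # design with intercept
    hat = Xd @ np.linalg.solve(Xd.T @ Xd, Xd.T)      # hat matrix
    e = y - hat @ y                                  # ordinary residuals
    h = np.diag(hat)
    return float(np.sum((e / (1.0 - h)) ** 2))
```

Dividing PRESS by n gives a leave-one-out estimate of the mean squared error of prediction for the fixed-form (unselected) model.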


One of the difficulties in the practical application of ridge regression is that, for a given data set, it is unknown whether a selected ridge estimator has smaller squared error than the least squares estimator. The concept of the improvement region is defined, and a technique is developed which obtains approximate confidence intervals for the value of ridge k which produces the maximum reduction in mean squared error. Two simulation experiments were conducted to investigate how accurate these approximate confidence intervals might be.
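
The situation described, where ridge may or may not beat least squares for a given k, is easy to reproduce by simulation. The sketch below (with an arbitrary design matrix and coefficient vector, not the paper's setup) estimates the mean squared error of the ridge coefficient estimates over a grid of k values:

```python
import numpy as np

rng = np.random.default_rng(7)

def ridge_mse_curve(n=30, p=5, sigma=2.0, ks=(0.0, 0.5, 1.0, 2.0),
                    reps=300):
    """Monte Carlo MSE of ridge estimates over a grid of k, where
    k = 0 is ordinary least squares. The design and true coefficients
    are made up for illustration."""
    X = rng.normal(size=(n, p))
    X = X / np.linalg.norm(X, axis=0)          # unit-norm columns
    beta = np.ones(p)                          # arbitrary true betas
    XtX = X.T @ X
    mse = {k: 0.0 for k in ks}
    for _ in range(reps):
        y = X @ beta + rng.normal(0.0, sigma, n)
        for k in ks:
            bhat = np.linalg.solve(XtX + k * np.eye(p), X.T @ y)
            mse[k] += np.sum((bhat - beta) ** 2) / reps
    return mse
```

With noisy data some k > 0 typically lies inside the improvement region (MSE below the k = 0 value), but in practice the true MSE curve is unobservable, which is exactly the problem the confidence-interval technique above addresses.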


The sustainable management of natural resources in projects involving the use of water resources (among others) often requires modification of the existing relief. This entails adapting the homogeneous upper layer of the soil, an operation usually called land leveling ("sistematización"), which facilitates a more uniform distribution of rainfall and irrigation water. This modification of the upper soil layer is carried out according to a design whose slopes follow either the natural gradients or those established by the designer. When the design is executed over areas larger than one hectare, the earth moving is done with heavy equipment, which does not guarantee a high efficiency: part of the material is lost during hauling and, especially, through the non-uniform compaction of the material, associated with the complex textures of the soil being worked. This study determined the precision achieved in executing a land-leveling design using an internationally accepted statistical index, the Root Mean Squared Error (RMSE), by comparing the designed elevations with those actually obtained after execution, in three plots with different sequences of operations and machinery but the same soil type, in the Pilar-La Plata axis area (Argentina). The results, with RMSE values ranging from 4 to 6 cm, lead to the conclusion that, for the sites and conditions studied, an execution precision better than 4 cm cannot be guaranteed in land leveling.
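
The RMSE index used in this kind of study is simply the root mean squared difference between designed and as-built spot elevations; a minimal sketch:

```python
import numpy as np

def rmse(design, asbuilt):
    """Root mean squared error between designed and as-built spot
    elevations, in the same units as the inputs (e.g. cm)."""
    design = np.asarray(design, float)
    asbuilt = np.asarray(asbuilt, float)
    return float(np.sqrt(np.mean((asbuilt - design) ** 2)))
```

Evaluated over a grid of surveyed points per plot, this yields one precision figure per plot, comparable across machinery and work sequences.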


Basic information on the relief of a watershed, obtained through analytical-descriptive methodologies, provides those who evaluate projects related to the use of natural resources (such as integrated watershed management, environmental impact studies, soil degradation, deforestation, and conservation of water resources, among others) with the physical parameters needed for their analysis. These processes have a strong spatial component, and Geographic Information Systems (GIS) are extremely useful for them, with Digital Elevation Models (DEMs) and their derivatives being a key component of this database. Products derived from these models, such as slope, aspect or curvature, will be only as accurate as the DEM used to derive them. It is also essential to maximize the model's ability to represent terrain variations; to this end, an adequate resolution (grid size) must be selected according to the data available for its generation. This study evaluates the altimetric quality of six DEMs generated from two different source-data capture systems and different grid resolutions. To determine the accuracy of a DEM, a set of control points regarded as "ground truth" is usually compared with the values generated by the model at the same geographic position. The study area is located in Arrecifes, Buenos Aires province (Argentina), and covers approximately 120 ha. The results for the two algorithms and the three grid sizes analyzed were as follows: for the DEM from contourn algorithm, an RMSE (Root Mean Squared Error) of ±0.11 m (1 m grid), ±0.11 m (5 m grid) and ±0.15 m (10 m grid); for the DEM from vector/points algorithm, an RMSE of ±0.09 m (1 m grid), ±0.11 m (5 m grid) and ±0.11 m (10 m grid). These results lead to the conclusion that only the DEM generated from surveyed spot heights as source data and with the smallest grid size satisfies the values reported in both the national and international literature, which makes it suitable for natural resource projects at the ecotope (farm) level. The remaining DEMs show RMSE values that make them suitable for the evaluation of projects related to the use of natural resources at the landscape-unit level (sets of ecotopes).


Up to now, snow cover on Antarctic sea ice and its impact on radar backscatter, particularly after the onset of freeze/thaw processes, have not been well understood. Here we present a combined analysis of in situ observations of snow properties from the landfast sea ice in Atka Bay, Antarctica, and high-resolution TerraSAR-X backscatter data, for the transition from austral spring (November 2012) to summer (January 2013). The physical changes in the seasonal snow cover during that time are reflected in the evolution of TerraSAR-X backscatter. We are able to explain 76–93% of the spatio-temporal variability of the TerraSAR-X backscatter signal with up to four snowpack parameters, with a root mean squared error of 0.87–1.62 dB, using a simple multiple linear model. Over the complete study, and especially after the onset of early-melt processes and freeze/thaw cycles, the majority of variability in the backscatter is influenced by changes in snow/ice interface temperature, snow depth and top-layer grain size. This suggests it may be possible to retrieve snow physical properties over Antarctic sea ice from X-band SAR backscatter.
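
A multiple linear model of this kind, regressing backscatter on a few snowpack parameters and reporting explained variance and RMSE, can be sketched generically (this is an ordinary-least-squares illustration, not the authors' exact procedure):

```python
import numpy as np

def fit_backscatter_model(snow_params, backscatter):
    """OLS fit of backscatter (dB) on a small set of snowpack
    parameters; returns coefficients (intercept first), R^2 and RMSE.
    A generic sketch of a simple multiple linear model."""
    P = np.asarray(snow_params, float)
    y = np.asarray(backscatter, float)
    X = np.column_stack([np.ones(len(P)), P])        # add intercept
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ coef
    r2 = 1.0 - np.sum(resid ** 2) / np.sum((y - y.mean()) ** 2)
    return coef, float(r2), float(np.sqrt(np.mean(resid ** 2)))
```

Feeding in columns such as snow/ice interface temperature, snow depth and grain size would yield the R² ("explained variability") and dB-scale RMSE figures of the kind quoted above.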


The Schwalbenberg II loess-paleosol sequence (LPS) is a key site for Marine Isotope Stage 3 (MIS 3) in Western Europe owing to eight succeeding cambisols, which primarily constitute the Ahrgau Subformation. This LPS therefore qualifies as a test candidate for the potential of temporally high-resolution geochemical data obtained by X-ray fluorescence (XRF) scanning of discrete samples, which provides a fast and non-destructive tool for determining element composition. The geochemical data are first contextualized against existing proxy data such as magnetic susceptibility (MS) and organic carbon (Corg), and then aggregated into element log ratios characteristic of weathering intensity [LOG (Ca/Sr), LOG (Rb/Sr), LOG (Ba/Sr), LOG (Rb/K)] and dust provenance [LOG (Ti/Zr), LOG (Ti/Al), LOG (Si/Al)]. Interpreting rock magnetic particles is generally challenging in western Europe, where not only magnetic enhancement but also depletion plays a role. Our data indicate leaching- and top-soil-erosion-induced MS depletion at the Schwalbenberg II LPS. Besides weathering, LOG (Ca/Sr) is susceptible to secondary calcification; thus LOG (Rb/Sr) and LOG (Ba/Sr) are also shown to be influenced by calcification dynamics. Consequently, LOG (Rb/K) seems to be the most suitable weathering index, identifying the Sinzig Soils S1 and S2 as the most pronounced paleosols at this site. Sinzig Soil S3 is enclosed by gelic gleysols and, in contrast to S1 and S2, is only initially weathered, pointing to colder climate conditions. The Remagen Soils are also characterized by subtle to moderate positive excursions in the weathering indices. Comparing the Schwalbenberg II LPS with the nearby Eifel Lake Sediment Archive (ELSA) and other more distant German, Austrian and Czech LPS, while discussing time and climate as limiting factors for pedogenesis, we suggest that the lithologically determined paleosols are in-situ soil formations.
The provenance indices document a Zr-enrichment at the transition from the Ahrgau to the Hesbaye Subformation. This is explained by a conceptual model incorporating multiple sediment recycling and sorting effects in eolian and fluvial domains.