887 results for Bias-corrected average forecast
Abstract:
An important step to assess water availability is to have monthly time series representative of the current situation. In this context, a simple methodology is presented for application in large-scale studies in regions where a properly calibrated hydrologic model is not available, using the output variables simulated by regional climate models (RCMs) of the European project PRUDENCE under current climate conditions (period 1961–1990). The methodology compares different interpolation methods and alternatives to generate annual time series that minimise the bias with respect to observed values. The objective is to identify the best alternative to obtain bias-corrected, monthly runoff time series from the output of RCM simulations. This study uses information from 338 basins in Spain that cover the entire mainland territory and whose observed values of natural runoff have been estimated by the distributed hydrological model SIMPA. Four interpolation methods for downscaling runoff to the basin scale from 10 RCMs are compared with emphasis on the ability of each method to reproduce the observed behaviour of this variable. The alternatives consider the use of the direct runoff of the RCMs and the mean annual runoff calculated using five functional forms of the aridity index, defined as the ratio between potential evapotranspiration and precipitation. In addition, the comparison with respect to the global runoff reference of the UNH/GRDC dataset is evaluated, as a contrast of the “best estimator” of current runoff on a large scale. Results show that the bias is minimised using the direct original interpolation method and the best alternative for bias correction of the monthly direct runoff time series of RCMs is the UNH/GRDC dataset, although the formula proposed by Schreiber (1904) also gives good results.
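To illustrate the aridity-index approach mentioned above: Schreiber's (1904) formula estimates mean annual runoff from precipitation and potential evapotranspiration alone. The sketch below assumes the commonly quoted form Q = P·exp(−PET/P), where PET/P is the aridity index; the basin values are invented for illustration, not taken from the study.

```python
import math

def schreiber_runoff(precip_mm: float, pet_mm: float) -> float:
    """Mean annual runoff (mm/yr) from Schreiber's (1904) formula.

    Q = P * exp(-PET/P), where PET/P is the aridity index.
    """
    aridity_index = pet_mm / precip_mm
    return precip_mm * math.exp(-aridity_index)

# Hypothetical basin: 600 mm/yr precipitation, 1200 mm/yr PET (aridity index 2)
q = schreiber_runoff(600.0, 1200.0)
```

Because runoff is bounded between 0 and P, formulas of this family are convenient for correcting the bias of simulated mean annual runoff against observed climate data.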
Abstract:
This thesis makes a methodological contribution to the study of the impact of climate change on water uses, focusing particularly on agriculture. Taking their distinct nature into account, the methodology addresses in an integrated way the impacts on rainfed and irrigated agriculture. To this end it incorporates different agricultural and water models which, together with the simulations of the climate scenarios, make it possible to determine impact indicators based on crop productivity, in the case of rainfed agriculture, and impact indicators based on the availability of water for irrigation, in the case of irrigated agriculture. The methodology takes into consideration the effect of climate variability on agriculture, assessing the adaptation and management needs associated with the mean impacts and the variability of crop productivity, as well as the effect of hydrological variability on the availability of water for irrigation. Given the large amount of information provided by the outputs of the climate scenario simulations and the complexity of processing it, an automated calculation tool has been developed that integrates different climate scenarios, methods and models to address the impact of climate change on agriculture at large scale. The methodological procedure starts from the analysis of the climate scenarios under the current (1961-1990) and future (2071-2100) situations, in order to determine their reliability and to establish exactly what the climate projections say about the expected impacts on the main variables involved in the hydrological cycle. The hydrological analysis is carried out in the territorial areas of water planning in Spain, considering the availability of information to validate the results under the control scenario. 
The naturalized runoff series estimated by the hydrological model SIMPA, which is calibrated over the whole of the Spanish territory, are used as observed data. When working at large scale, the limited availability of data or the lack of properly calibrated hydrological models to obtain runoff values often hinders the assessment process; therefore, this study proposes a methodology that compares different interpolation methods and alternatives for generating annual runoff series that minimise the bias with respect to the observed values. Then, based on the alternative that yields the best results, bias-corrected monthly series are obtained from the simulations of the regional climate models (RCMs). Four interpolation methods for obtaining the values of the variables at the river-basin scale are compared, with emphasis on the ability of each method to reproduce the observed values. The alternatives considered use the direct runoff simulated by the RCMs and the mean annual runoff calculated using five climatological formulas based on the aridity index. The results are also compared with the global reference runoff provided by the UNH/GRDC dataset, which is currently the “best estimator” of present runoff at large scale. The impact of climate change on rainfed agriculture is assessed by considering the combined effect of the risks associated with the anomalies given by changes in the mean and in the variability of crop productivity in the agro-climatic regions of Europe. This procedure facilitates the determination of adaptation needs and the identification of the regional impacts that must be addressed most urgently according to the risks and opportunities identified. 
For this purpose, regional productivity functions developed and calibrated in previous European studies are used. For irrigated agriculture, the availability of water for irrigation is used as an indicator of impact under climate change scenarios. Considering that most studies have focused on assessing water availability under the natural regime, this work incorporates the effect of hydraulic infrastructure when calculating the available resource under climate change scenarios. This analysis is carried out for the Spanish territory, considering the availability of information on both the streamflows and the operation models of the water supply systems. The water resources management model WAAPA (Water Availability and Adaptation Policy Assessment) is used, which allows the maximum demand that can be met under given reliability criteria to be calculated. The observed monthly runoff series and the monthly runoff series corrected by the previously proposed methodology are used to assess water availability under the control scenario. Climate projections are constructed using the changes in the mean values and in the variability of the inflows simulated by the RCMs, and also using a climatological formula based on the aridity index. Management needs are assessed in terms of the satisfaction of irrigation water demands by comparing water availability under the current situation with water availability under climate change scenarios. 
Finally, through the development of a calculation tool that facilitates the handling and automation of the large amount of complex information obtained from the RCM simulations, a methodological process is obtained that comprehensively evaluates the impact of climate change on agriculture at large scale, while at the same time making it possible to determine adaptation and management needs according to the identified priorities.
ABSTRACT
This thesis presents a methodological contribution for studying the impact of climate change on water use, focusing particularly on agriculture. Taking into account their different nature, this methodology addresses the impacts on rainfed and irrigated agriculture, integrating agricultural and water planning models with climate change simulation scenarios in order to determine impact indicators based on crop productivity and on water availability for irrigation, respectively. The methodology incorporates the effect of climate variability on agriculture, assessing adaptation and management needs associated with mean impacts, variability in crop productivity and the effect of hydrologic variability on water availability for irrigation. Considering the vast amount of information provided by the outputs of the regional climate model (RCM) simulations and the complexity of processing it, a tool has been developed to integrate different climate scenarios, methods and models to address the impact of climate change on agriculture at large scale. Firstly, a hydrological analysis of the climate change scenarios is performed under the current (1961-1990) and future (2071-2100) situations in order to establish exactly what the model projections say about the expected impact on the main variables involved in the hydrological cycle. 
Due to the availability of information for validating the results under the current situation, the hydrological analysis is developed in the territorial areas of water planning in Spain, where the values of naturalized runoff have been estimated by the hydrological model SIMPA and are used as observed data. When working in large-scale studies, the limited availability of data or the lack of a properly calibrated hydrological model makes it difficult to obtain runoff time series. Therefore, a methodology is proposed to compare different interpolation methods and alternatives to generate annual time series that minimize the bias with respect to observed values. The best alternative is then selected in order to obtain bias-corrected monthly time series from the RCM simulations. Four interpolation methods for downscaling runoff to the basin scale from different RCMs are compared, with emphasis on the ability of each method to reproduce the observed behavior of this variable. The alternatives consider the use of the direct runoff of the RCMs and the mean annual runoff calculated using five functional forms of the aridity index. The results are also compared with the global runoff reference provided by the UNH/GRDC dataset, as a contrast of the “best estimator” of current runoff on a large scale. Secondly, the impact of climate change on rainfed agriculture is assessed considering the combined effect of the risks associated with anomalies given by changes in the mean and variability of crop productivity in the agro-climatic regions of Europe. This procedure allows determining adaptation needs based on the regional impacts that must be addressed with greater urgency in light of the risks and opportunities identified. Statistical models of productivity response, developed and calibrated in a previous European study, are used for this purpose. 
Thirdly, the impact of climate change on irrigated agriculture is evaluated considering the water availability for irrigation as an indicator of the impact. Given that most studies have focused on assessing water availability in the natural regime, the effect of regulation is incorporated in this approach. The analysis is developed in the Spanish territory considering the available information on the observed streamflows and the regulation system. The Water Availability and Adaptation Policy Assessment (WAAPA) model is used in this study, which allows obtaining the maximum demand that could be supplied under certain conditions (demand seasonal distribution, water supply system management, and reliability criteria) for different policy alternatives. The bias-corrected monthly time series obtained with the previous methodology are used in order to assess water availability under the current situation. Climate change projections are constructed taking into account the variation in the mean and in the coefficient of variation simulated by the RCMs. The management needs are determined by the satisfaction of agricultural demands, through the comparison between water availability under current conditions and under climate change projections. Overall, the methodology allows evaluating the impact of climate change on agriculture at large scale, using a tool that facilitates the processing of the large amount of complex information provided by the RCM simulations, in order to determine the adaptation and management needs in accordance with the priorities of the identified impacts.
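WAAPA itself simulates full water supply systems; as a toy stand-in for the core idea above, the "maximum demand that can be supplied under a reliability criterion", the sketch below bisects on a constant monthly demand for a single hypothetical reservoir. The inflow series, capacity, and 95% monthly reliability target are invented for illustration and are not WAAPA's actual configuration.

```python
def supplied_fraction(inflows, demand, capacity):
    """Simulate a single reservoir with a fixed monthly demand and return
    the fraction of months in which the demand is fully met."""
    storage, met = 0.0, 0
    for q in inflows:
        storage = min(storage + q, capacity)   # fill, spill above capacity
        if storage >= demand:
            storage -= demand
            met += 1
        else:
            storage = 0.0                      # deficit month: supply what's left
    return met / len(inflows)

def max_reliable_demand(inflows, capacity, reliability=0.95, iters=40):
    """Largest constant monthly demand meeting the reliability criterion,
    found by bisection (demand 0 is always feasible)."""
    lo, hi = 0.0, max(inflows) * 2
    for _ in range(iters):
        mid = (lo + hi) / 2
        if supplied_fraction(inflows, mid, capacity) >= reliability:
            lo = mid
        else:
            hi = mid
    return lo

# Ten years of a hypothetical seasonal inflow pattern (hm3/month)
inflows = [30, 10, 5, 0, 0, 5, 20, 40, 60, 50, 35, 25] * 10
avail = max_reliable_demand(inflows, capacity=100.0)
```

Comparing `avail` computed from control-period inflows against `avail` computed from climate-change-perturbed inflows gives the kind of availability-based impact indicator the thesis describes.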
Abstract:
Climate projections indicate that rising temperatures will affect summer crops in the southern Iberian Peninsula. The aim of this study was to obtain projections of the impacts of rising temperatures, and of a higher frequency of extreme events, on irrigated maize, and to evaluate some adaptation strategies. The study was conducted at several locations in Andalusia using the CERES-Maize crop model, previously calibrated/validated with local experimental datasets. The simulated climate consisted of projections from regional climate models from the ENSEMBLES project; these were corrected for daily temperature and precipitation with regard to the E-OBS observational dataset. These bias-corrected projections were used with the CERES-Maize model to generate future impacts. Crop model results showed a decrease in maize yield by the end of the 21st century from 6 to 20%, a decrease of up to 25% in irrigation water requirements, and an increase in irrigation water productivity of up to 22%, due to earlier maturity dates and stomatal closure caused by CO2 increase. When adaptation strategies combining earlier sowing dates and cultivar changes were considered, impacts were offset, and maize yield increased up to 14% compared with the baseline period (1981-2010), with similar reductions in crop irrigation water requirements. Damages from extreme maximum temperatures rose to 40% at the end of the 21st century, compared with the baseline. Adaptation resulted in an overall reduction in extreme Tmax damages in all locations, with the exception of Granada, where losses were limited to 8%.
Abstract:
Apatite (U-Th-Sm)/He (AHe) thermochronology is increasingly used for reconstructing geodynamic processes of the upper crust and the surface. Results of AHe thermochronology, however, are often in conflict with apatite fission track (AFT) thermochronology, yielding an inverted age-relationship with AHe dates older than AFT dates of the same samples. This effect is mainly explained by radiation damage of apatite, either impeding He diffusion or causing non-thermal annealing of fission tracks. So far, systematic age inversions have only been described for old and slowly cooled terranes, whereas for young and rapidly cooled samples 'too old' AHe dates are usually explained by the presence of undetected U and/or Th-rich micro-inclusions. We report apatite (U-Th-Sm)/He results for rapidly cooled volcanogenic samples deposited in a deep ocean environment with a relatively simple post-depositional thermal history. Robust age constraints are provided independently through sample biostratigraphy. All studied apatites have low U contents (< 5 ppm on average). While AFT dates are largely in agreement with deposition ages, most AHe dates are too old. For leg 43, where deposition age of sampled sediment is 26.5-29.5 Ma, alpha-corrected average AHe dates are up to 45 Ma, indicating overestimations of AHe dates up to 50%. This is explained by He implantation from surrounding host U-Th rich sedimentary components and it is shown that AHe dates can be "corrected" by mechanically abrading the outer part of grains. We recommend that particularly for low U-Th-apatites the possibility of He implantation should be carefully checked before considering the degree to which the alpha-ejection correction should be applied.
Abstract:
Fluid inclusions in variably altered diabase recovered from Ocean Drilling Program Legs 137 and 140 at Hole 504B, Costa Rica Rift, exhibit fluid salinities up to 3.7 times that of seawater values (11.7 wt% NaCl equivalent) and exhibit uncorrected homogenization temperatures of 125°C to 202°C. The liquid-dominated inclusions commonly are entrapped in zones of secondary plagioclase and may be primary in origin. Fluid salinities are similar to compositions of fluids venting on the seafloor (0.4-7.0 wt% NaCl) and overlap with those measured in metabasalt samples recovered from near the Kane Fracture Zone on the Mid-Atlantic Ridge and from the Troodos ophiolite, Cyprus. The salinity variations may reflect hydration reactions involving formation of secondary mineral assemblages under rock-dominated conditions, which modify the ionic strength of hydrothermal fluids by consuming or liberating water and chloride ion. Rare CO2-CH4-bearing inclusions, subjacent to zones where talc after olivine becomes an important secondary mineral phase (1700 mbsf), may have formed due to local interaction of seawater and olivine at low water to rock ratios. Corrected average fluid inclusion homogenization temperatures exhibit a gradient from 159°C at a depth of 1370 mbsf to 183°C at a depth of 1992 mbsf and are in apparent equilibrium with the present conductive downhole temperatures. These data indicate that fluid inclusions may be used to estimate downhole temperatures if logging data are unavailable. The compositional and thermal evolution of the diabase-hosted fluids may reflect late-stage, off-axis circulation and conductive heating of compositionally modified seawater in the sheeted dike complex at Hole 504B.
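The suggestion above, that fluid inclusions may be used to estimate downhole temperatures where logging data are unavailable, amounts to interpolating along the reported gradient. The sketch below linearly interpolates between the two corrected homogenization temperatures quoted in the abstract (159°C at 1370 mbsf, 183°C at 1992 mbsf); the intermediate depth queried is an arbitrary example.

```python
def downhole_temp(depth_mbsf: float) -> float:
    """Estimate downhole temperature (degC) by linear interpolation between
    the corrected fluid-inclusion homogenization temperatures from the text."""
    d1, t1 = 1370.0, 159.0   # shallow anchor (mbsf, degC)
    d2, t2 = 1992.0, 183.0   # deep anchor (mbsf, degC)
    gradient = (t2 - t1) / (d2 - d1)   # ~0.039 degC per metre
    return t1 + gradient * (depth_mbsf - d1)

# e.g. near the depth where talc after olivine appears (~1700 mbsf)
t_1700 = downhole_temp(1700.0)
```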
Abstract:
Using the wisdom of crowds---combining many individual forecasts to obtain an aggregate estimate---can be an effective technique for improving forecast accuracy. When individual forecasts are drawn from independent and identical information sources, a simple average provides the optimal crowd forecast. However, correlated forecast errors greatly limit the ability of the wisdom of crowds to recover the truth. In practice, this dependence often emerges because information is shared: forecasters may to a large extent draw on the same data when formulating their responses.
To address this problem, I propose an elicitation procedure in which each respondent is asked to provide both their own best forecast and a guess of the average forecast that will be given by all other respondents. I study optimal responses in a stylized information setting and develop an aggregation method, called pivoting, which separates individual forecasts into shared and private information and then recombines these results in the optimal manner. I develop a tailored pivoting procedure for each of three information models, and introduce a simple and robust variant that outperforms the simple average across a variety of settings.
In three experiments, I investigate the method and the accuracy of the crowd forecasts. In the first study, I vary the shared and private information in a controlled environment, while the latter two studies examine forecasts in real-world contexts. Overall, the data suggest that a simple minimal pivoting procedure provides an effective aggregation technique that can significantly outperform the crowd average.
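The pivoting idea described above can be sketched numerically: each judge reports both a forecast and a guess of the others' average, and the aggregate shifts the crowd mean away from the mean meta-prediction to cancel the shared component. The form used here, pivot = mean + (mean − meta-mean), and the toy numbers are an illustrative assumption, not the dissertation's full tailored procedures.

```python
def minimal_pivot(forecasts, meta_predictions):
    """Crowd estimate that pivots away from shared information.

    pivot = x_bar + (x_bar - y_bar), where x_bar is the mean forecast and
    y_bar is the mean guess of the others' average forecast.
    """
    x_bar = sum(forecasts) / len(forecasts)
    y_bar = sum(meta_predictions) / len(meta_predictions)
    return 2 * x_bar - y_bar

# Toy example: the truth is 10, but everyone also saw a shared, biased
# signal of 8. Own forecasts average a private signal with the shared one;
# guesses of the others' average sit at the shared signal.
forecasts = [8.5, 9.0, 9.5]          # (8 + private) / 2 with privates 9, 10, 11
meta = [8.0, 8.0, 8.0]
crowd_mean = sum(forecasts) / 3      # pulled toward the shared signal
pivoted = minimal_pivot(forecasts, meta)
```

In this stylized case the simple average lands at 9.0 while the pivoted estimate recovers the truth, illustrating why correlated (shared) information defeats plain averaging.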
Abstract:
Routine analyses for the quantification of organic acids and sugars are generally slow methods that involve the use and preparation of several reagents, require trained professionals and special equipment, and are expensive. In this context, there has been increasing investment in research aimed at developing substitutes for the reference methods that are faster, cheaper and simpler, and infrared spectroscopy has stood out in this regard. The present study developed multivariate calibration models for the simultaneous quantitative determination of ascorbic, citric, malic and tartaric acids, the sugars sucrose, glucose and fructose, and soluble solids in fruit juices and nectars, as well as classification models based on PCA. Near-infrared (NIR) spectroscopy was used in association with partial least squares (PLS) regression. Forty-two samples of fruit juices and nectars commercially available in local shops were used. For the construction of the models, reference analyses were performed using high-performance liquid chromatography (HPLC), with refractometry for the analysis of soluble solids. Subsequently, the spectra were acquired in triplicate over the spectral range 12500 to 4000 cm-1. The best models were applied to the quantification of the analytes under study in natural juices and in juice samples produced in the Southwest Region of Paraná. The juices used in the application of the models also underwent physicochemical analysis. Validation of the chromatographic methodology showed satisfactory results, since the external calibration curves had coefficients of determination (R2) above 0.98 and coefficients of variation (%CV) for intermediate precision and repeatability below 8.83%. 
Principal Component Analysis (PCA) made it possible to separate the juice samples into two major groups, grape and apple versus tangerine and orange, while for the nectars the groups separated into guava and grape versus pineapple and apple. Using different validation methods and pre-processing techniques, applied separately and in combination, multivariate calibration models were obtained with root mean square errors of prediction (RMSEP) and of cross-validation (RMSECV) below 1.33 and 1.53 g/100 mL, respectively, and R2 above 0.771, except for malic acid. The physicochemical analyses enabled the characterisation of the drinks, including the pH working range (2.83 to 5.79) and acidity within the regulatory parameters for each flavour. The regression models demonstrated that ascorbic, citric, malic and tartaric acids, as well as sucrose, glucose and fructose, can be successfully determined from a single spectrum, suggesting that the models are economically viable for quality control and product standardisation in the fruit juice and nectar processing industry.
Abstract:
As a science, economics aims, among other things, to observe, qualify and quantify economic phenomena in order to derive various forecasts from them. This thesis focuses on these forecasts and, more specifically, on the behavioural factors that can bias forecasters, with reference to the anchoring effect, a bias studied in behavioural economics, a sub-discipline of economics. The aim is therefore to understand, through an analysis grounded in behavioural economics, what can affect forecasters, with particular emphasis on the anchoring effect. The general idea of the latter is that an agent can be unconsciously biased by the mere knowledge of a previous value when asked to make a subsequent estimate. Accordingly, an analysis of the salaries of National Hockey League (NHL) players as a function of their past performance and personal characteristics, from 2007 to 2016, was carried out in this work in order to identify possible anchoring effects. It can be observed that the general managers of the league's teams generally act in a sensible and rational way when awarding contracts to players; nevertheless, an anomaly persists when attention is paid to the rank at which a player was drafted. In this context, it seems relevant to turn to behavioural economics to explain why draft rank remains a significant variable eight years after a player's entry into the NHL, and why it behaves in the opposite way to what theory predicts on this subject.
Abstract:
Global climate change is predicted to have impacts on the frequency and severity of flood events. In this study, output from General Circulation Models (GCMs) for a range of possible future climate scenarios was used to force hydrologic models for four case study watersheds built using the Soil and Water Assessment Tool (SWAT). GCM output was applied with either the "delta change" method or a bias correction. Potential changes in flood risk are assessed based on modeling results and possible relationships to watershed characteristics. Differences in model outputs when using the two different methods of adjusting GCM output are also compared. Preliminary results indicate that watersheds exhibiting higher proportions of runoff in streamflow are more vulnerable to changes in flood risk. The delta change method appears to be more useful when simulating extreme events, as it better preserves daily climate variability than using bias-corrected GCM output.
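The two ways of adjusting GCM output mentioned above can be contrasted in a minimal sketch (multiplicative mean-scaling for precipitation; all series are invented): the delta change method perturbs the observed series by the GCM's climate-change signal, keeping the observed day-to-day variability, whereas bias correction rescales the GCM's future series toward the observed statistics, keeping the GCM's variability. Real implementations typically work month-by-month or quantile-by-quantile rather than on a single mean factor.

```python
obs_ctrl = [2.0, 0.0, 5.0, 1.0, 3.0]     # observed daily precipitation (mm)
gcm_ctrl = [1.5, 0.5, 4.0, 1.0, 2.0]     # GCM output, control period
gcm_fut  = [2.0, 0.5, 5.5, 1.5, 2.5]     # GCM output, future scenario

mean = lambda xs: sum(xs) / len(xs)

# Delta change: apply the GCM change factor to the observed series,
# preserving the observed variability structure.
delta = mean(gcm_fut) / mean(gcm_ctrl)
fut_delta = [d * delta for d in obs_ctrl]

# Bias correction (simple mean scaling): remove the GCM's control-period
# bias from its future series, preserving the GCM's variability.
bias = mean(obs_ctrl) / mean(gcm_ctrl)
fut_bc = [g * bias for g in gcm_fut]
```

Both series end up with the same mean, so the choice between methods matters mainly for extremes, which is exactly the study's point about daily variability.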
Abstract:
The increasing number of extreme rainfall events, combined with high population density and the imperviousness of the land surface, makes urban areas particularly vulnerable to pluvial flooding. In order to design and manage cities able to deal with this issue, the reconstruction of weather phenomena is essential. Among the most promising data sources are the observational networks of private sensors managed by citizens (crowdsourcing). The number of these personal weather stations is steadily increasing, and their spatial distribution roughly follows population density. Precisely for this reason, they are well suited to this detailed study on the modelling of pluvial flooding in urban environments. The uncertainty associated with these measurements of precipitation is still a matter of research. In order to characterise the accuracy and precision of the crowdsourced data, we carried out exploratory data analyses. A comparison between Netatmo hourly precipitation amounts and observations of the same quantity from weather stations managed by national weather services is presented. The crowdsourced stations are very good at detecting rain but tend to underestimate the reference value. In detail, the accuracy and precision of crowdsourced data change as precipitation increases, with the spread improving towards the extreme values. Then, the ability of this kind of observation to improve the prediction of pluvial flooding is tested. To this aim, the simplified raster-based inundation model incorporated in the Saferplaces web platform is used for simulating pluvial flooding. Different precipitation fields have been produced and tested as input to the model. Two case studies are analysed over the most densely populated Norwegian city, Oslo. The crowdsourced weather station observations, bias-corrected (i.e. increased by 25%), showed very good skill in detecting flooded areas.
Abstract:
The large spatial inhomogeneity in transmit B-1 field (B-1(+)) observable in human MR images at high static magnetic fields (B-0) severely impairs image quality. To overcome this effect in brain T-1-weighted images, the MPRAGE sequence was modified to generate two different images at different inversion times, MP2RAGE. By combining the two images in a novel fashion, it was possible to create T-1-weighted images where the result image was free of proton density contrast, T-2* contrast, reception bias field, and, to first order, transmit field inhomogeneity. MP2RAGE sequence parameters were optimized using Bloch equations to maximize contrast-to-noise ratio per unit of time between brain tissues and minimize the effect of B-1(+) variations through space. Images of high anatomical quality and excellent brain tissue differentiation suitable for applications such as segmentation and voxel-based morphometry were obtained at 3 and 7 T. From such T-1-weighted images, acquired within 12 min, high-resolution 3D T-1 maps were routinely calculated at 7 T with sub-millimeter voxel resolution (0.65-0.85 mm isotropic). T-1 maps were validated in phantom experiments. In humans, the T-1 values obtained at 7 T were 1.15 +/- 0.06 s for white matter (WM) and 1.92 +/- 0.16 s for grey matter (GM), in good agreement with literature values obtained at lower spatial resolution. At 3 T, where whole-brain acquisitions with 1 mm isotropic voxels were acquired in 8 min, the T-1 values obtained (0.81 +/- 0.03 s for WM and 1.35 +/- 0.05 s for GM) were once again found to be in very good agreement with values in the literature. (C) 2009 Elsevier Inc. All rights reserved.
Abstract:
The large spatial inhomogeneity in transmit B(1) field (B(1)(+)) observable in human MR images at high static magnetic fields (B(0)) severely impairs image quality. To overcome this effect in brain T(1)-weighted images, the MPRAGE sequence was modified to generate two different images at different inversion times, MP2RAGE. By combining the two images in a novel fashion, it was possible to create T(1)-weighted images where the result image was free of proton density contrast, T(2) contrast, reception bias field, and, to first order, transmit field inhomogeneity. MP2RAGE sequence parameters were optimized using Bloch equations to maximize contrast-to-noise ratio per unit of time between brain tissues and minimize the effect of B(1)(+) variations through space. Images of high anatomical quality and excellent brain tissue differentiation suitable for applications such as segmentation and voxel-based morphometry were obtained at 3 and 7 T. From such T(1)-weighted images, acquired within 12 min, high-resolution 3D T(1) maps were routinely calculated at 7 T with sub-millimeter voxel resolution (0.65-0.85 mm isotropic). T(1) maps were validated in phantom experiments. In humans, the T(1) values obtained at 7 T were 1.15+/-0.06 s for white matter (WM) and 1.92+/-0.16 s for grey matter (GM), in good agreement with literature values obtained at lower spatial resolution. At 3 T, where whole-brain acquisitions with 1 mm isotropic voxels were acquired in 8 min, the T(1) values obtained (0.81+/-0.03 s for WM and 1.35+/-0.05 for GM) were once again found to be in very good agreement with values in the literature.
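The "novel fashion" of combining the two inversion-time images is commonly written, in the MP2RAGE literature, as the ratio Re(S1·conj(S2)) / (|S1|² + |S2|²), which is bounded in [-0.5, 0.5] and cancels proton density, T2* decay and the receive bias field; the abstract itself does not spell the formula out, so treat this as the standard published form rather than a quotation. The toy images below are invented.

```python
import numpy as np

def mp2rage_combine(s_ti1: np.ndarray, s_ti2: np.ndarray) -> np.ndarray:
    """Combine the two complex inversion-time images into a uniform image.

    MP2RAGE = Re(S1 * conj(S2)) / (|S1|^2 + |S2|^2), bounded in [-0.5, 0.5].
    Multiplicative factors common to both images (receive bias, proton
    density) cancel in the ratio.
    """
    num = np.real(s_ti1 * np.conj(s_ti2))
    den = np.abs(s_ti1) ** 2 + np.abs(s_ti2) ** 2
    return num / den

# Toy 2x2 "images": a shared receive-bias factor multiplies both and cancels.
rx_bias = np.array([[1.0, 2.0], [0.5, 3.0]])
s1 = rx_bias * np.array([[-0.3 + 0.1j, 0.2], [0.5, -0.1]])
s2 = rx_bias * np.array([[1.0 + 0.0j, 0.8], [0.9, 1.0]])
uni = mp2rage_combine(s1, s2)
```

The boundedness follows from |Re(ab̄)| ≤ |a||b| ≤ (|a|² + |b|²)/2, which is why the combined image has a fixed, bias-free intensity scale suitable for T1 mapping.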
Abstract:
The scope of this study was to estimate calibrated values for dietary data obtained by the Food Frequency Questionnaire for Adolescents (FFQA) and to illustrate the effect of this approach on food consumption data. The adolescents were assessed on two occasions, with an average interval of twelve months. In 2004, 393 adolescents participated, and 289 were reassessed in 2005. Dietary data obtained by the FFQA were calibrated using the regression coefficients estimated from the average of two 24-hour recalls (24HR) in the subsample. The calibrated values were similar to the 24HR reference measurement in the subsample. In 2004 and 2005 a significant difference was observed between the average consumption levels of the FFQA before and after calibration for all nutrients. With the use of calibrated data, the proportion of schoolchildren with fiber intake below the recommended level increased. Therefore, calibrated data can be used to obtain adjusted associations due to the reclassification of subjects within the predetermined categories.
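The calibration step described above can be sketched as a simple linear regression: in the subsample with both instruments, the 24HR average is regressed on the FFQA value, and the fitted coefficients are then applied to every FFQA measurement. The intakes below are invented for illustration; the actual study fit nutrient-specific models.

```python
# Subsample measured with both instruments (hypothetical fibre intakes, g/day)
ffqa_sub = [10.0, 14.0, 18.0, 22.0, 26.0]
recall24h = [9.0, 11.5, 14.0, 16.5, 19.0]   # average of two 24HR per subject

# Ordinary least squares slope and intercept of 24HR on FFQA
mean_x = sum(ffqa_sub) / len(ffqa_sub)
mean_y = sum(recall24h) / len(recall24h)
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(ffqa_sub, recall24h))
         / sum((x - mean_x) ** 2 for x in ffqa_sub))
intercept = mean_y - slope * mean_x

def calibrate(ffqa_value: float) -> float:
    """Map an FFQA measurement onto the 24HR reference scale."""
    return intercept + slope * ffqa_value

# Apply to the full sample's FFQA values (invented)
calibrated = [calibrate(v) for v in [12.0, 20.0, 30.0]]
```

Because the FFQA overestimates relative to 24HR in this toy subsample (slope below 1), calibration pulls intakes down, which is how subjects get reclassified below a recommended-intake cutoff, as reported for fibre.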
Abstract:
Additive and multiplicative models of relative risk were used to measure the effect of cancer misclassification and DS86 random errors on lifetime risk projections in the Life Span Study (LSS) of Hiroshima and Nagasaki atomic bomb survivors. The true number of cancer deaths in each stratum of the cancer mortality cross-classification was estimated using sufficient statistics from the EM algorithm. Average survivor doses in the strata were corrected for DS86 random error (σ = 0.45) by use of reduction factors. Poisson regression was used to model the corrected and uncorrected mortality rates with covariates for age at time of bombing, age at time of death and gender. Excess risks were in good agreement with risks in RERF Report 11 (Part 2) and the BEIR-V report. Bias due to DS86 random error typically ranged from -15% to -30% for both sexes and all sites and models. The total bias, including diagnostic misclassification, of the excess risk of nonleukemia for exposure to 1 Sv from age 18 to 65 under the non-constant relative projection model was -37.1% for males and -23.3% for females. Total excess risks of leukemia under the relative projection model were biased -27.1% for males and -43.4% for females. Thus, nonleukemia risks for 1 Sv from ages 18 to 85 (DDREF = 2) increased from 1.91%/Sv to 2.68%/Sv among males and from 3.23%/Sv to 4.02%/Sv among females. Leukemia excess risks increased from 0.87%/Sv to 1.10%/Sv among males and from 0.73%/Sv to 1.04%/Sv among females. Bias was dependent on the gender, site, correction method, exposure profile and projection model considered. Future studies that use LSS data for U.S. nuclear workers may be downwardly biased if lifetime risk projections are not adjusted for random and systematic errors. (Supported by U.S. NRC Grant NRC-04-091-02.)