Abstract:
This paper presents results of the AQL2004 project, which has been developed within the GOFC-GOLD Latin American network of remote sensing and forest fires (RedLatif). The project intended to obtain monthly burned-land maps of the entire region, from Mexico to Patagonia, using MODIS (moderate-resolution imaging spectroradiometer) reflectance data. The project was organized in three phases: acquisition and preprocessing of satellite data; discrimination of burned pixels; and validation of results. In the first phase, input data consisting of 32-day composites of MODIS 500-m reflectance data generated by the Global Land Cover Facility (GLCF) of the University of Maryland (College Park, Maryland, U.S.A.) were collected and processed. The discrimination of burned areas was addressed in two steps: searching for "burned core" pixels using postfire spectral indices and multitemporal change detection, and mapping of burned scars using contextual techniques. The validation phase was based on visual analysis of Landsat and CBERS (China-Brazil Earth Resources Satellite) images. Validation of the burned-land category showed an agreement ranging from 30% to 60%, depending on the ecosystem and vegetation species present. The total burned area for the entire year was estimated at 153 215 km². The countries most affected relative to their territory were Cuba, Colombia, Bolivia, and Venezuela. Burned areas were found in most land covers; herbaceous vegetation (savannas and grasslands) presented the highest proportions of burned area, while perennial forest had the lowest. The importance of croplands in the total burned area should be interpreted with caution, since this cover presented the highest commission errors. The importance of generating systematic burned-land products for different ecological processes is emphasized.
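The two-step discrimination can be sketched as a seed-and-grow procedure. The index used here (a dNBR-like multitemporal drop), both thresholds and the 4-neighbour growing rule are illustrative assumptions, not the project's exact algorithm:

```python
import numpy as np
from collections import deque

def map_burned(nbr_pre, nbr_post, seed_thresh=0.3, grow_thresh=0.15):
    """Sketch of a two-step burned-area mapper (thresholds hypothetical).

    Step 1: 'burned core' seeds where the multitemporal drop in a
    post-fire index (here dNBR = pre - post) is large.
    Step 2: contextual region growing admits 4-neighbours of mapped
    pixels that show a weaker, but still clear, burn signal.
    """
    dnbr = nbr_pre - nbr_post
    seeds = dnbr > seed_thresh
    burned = seeds.copy()
    queue = deque(zip(*np.nonzero(seeds)))
    rows, cols = dnbr.shape
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if 0 <= rr < rows and 0 <= cc < cols and not burned[rr, cc]:
                if dnbr[rr, cc] > grow_thresh:
                    burned[rr, cc] = True
                    queue.append((rr, cc))
    return burned
```

The growing step is what keeps weak-signal pixels from being mapped unless they are spatially connected to a confident core, which is the intuition behind the contextual techniques mentioned above.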
Abstract:
The latest coupled configuration of the Met Office Unified Model (Global Coupled configuration 2, GC2) is presented. This paper documents the model components which make up the configuration (although the scientific description of these components is detailed elsewhere) and provides a description of the coupling between the components. The performance of GC2 in terms of its systematic errors is assessed using a variety of diagnostic techniques. The configuration is intended to be used by the Met Office and collaborating institutes across a range of timescales, with the seasonal forecast system (GloSea5) and climate projection system (HadGEM) being the initial users. In this paper GC2 is compared against the model currently used operationally in those two systems. Overall GC2 is shown to be an improvement on the configurations used currently, particularly in terms of modes of variability (e.g. mid-latitude and tropical cyclone intensities, the Madden–Julian Oscillation and El Niño Southern Oscillation). A number of outstanding errors are identified with the most significant being a considerable warm bias over the Southern Ocean and a dry precipitation bias in the Indian and West African summer monsoons. Research to address these is ongoing.
Abstract:
The Arctic is an important region in the study of climate change, but monitoring surface temperatures there is challenging, particularly in areas covered by sea ice. Here, in situ, satellite and reanalysis data were utilised to investigate whether global warming over recent decades could be better estimated by changing the way the Arctic is treated in calculating global mean temperature. The degree of difference arising from five techniques, based on existing temperature anomaly dataset methods, for estimating Arctic surface air temperature (SAT) anomalies over land and sea ice was investigated using reanalysis data as a testbed. Techniques which interpolated anomalies were found to produce smaller errors than non-interpolating techniques, and kriging techniques provided the smallest errors in anomaly estimates. Similar accuracies were found for anomalies estimated from in situ meteorological station SAT records using a kriging technique. Whether additional data sources, not currently utilised in temperature anomaly datasets, would improve estimates of Arctic SAT anomalies was investigated within the reanalysis testbed and using in situ data. For the reanalysis study, the additional input anomalies were reanalysis data sampled at the locations of supplementary data sources over Arctic land and sea ice. For the in situ study, the additional input anomalies over sea ice were surface temperature anomalies derived from the Advanced Very High Resolution Radiometer satellite instruments. The use of additional data sources, particularly those located in the Arctic Ocean over sea ice or on islands in sparsely observed regions, can lead to substantial improvements in the accuracy of estimated anomalies. Decreases in root mean square error can be up to 0.2 K for Arctic-average anomalies and more than 1 K for spatially resolved anomalies. Further improvements in accuracy may be accomplished through the use of other data sources.
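A toy numerical illustration of why infilling the unobserved Arctic matters: the anomaly field below is synthetic and Arctic-amplified, observations stop at 70N, and the infill is a crude nearest-latitude extrapolation, far simpler than the kriging discussed above.

```python
import numpy as np

# Synthetic zonal-mean anomaly field (K) on a 1-degree latitude grid,
# with amplified warming poleward of 60N; all numbers are illustrative.
lat = np.arange(-89.5, 90.0, 1.0)
weights = np.cos(np.deg2rad(lat))                      # area weighting
anom = 0.3 + np.where(lat > 60, 0.05 * (lat - 60), 0.0)
observed = lat <= 70                                   # no data poleward of 70N

true_mean = np.average(anom, weights=weights)

# Non-interpolating estimate: average the observed region only, which
# implicitly treats the missing Arctic as average.
mean_noninterp = np.average(anom[observed], weights=weights[observed])

# Crude infill: extrapolate the northernmost observed anomaly poleward.
infilled = anom.copy()
infilled[~observed] = anom[observed][-1]
mean_interp = np.average(infilled, weights=weights)

err_noninterp = abs(mean_noninterp - true_mean)
err_interp = abs(mean_interp - true_mean)
```

Even this crude infill reduces the global-mean error, because the omitted region is systematically warmer than the observed mean; interpolation methods such as kriging refine the same idea.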
Abstract:
Intercomparison and evaluation of the global ocean surface mixed layer depth (MLD) fields estimated from a suite of major ocean syntheses are conducted. Compared with the reference MLDs calculated from individual profiles, MLDs calculated from monthly mean and gridded profiles show negative biases of 10–20 m in early spring related to the re-stratification process of relatively deep mixed layers. Vertical resolution of profiles also influences the MLD estimation. MLDs are underestimated by approximately 5–7 (14–16) m at a vertical resolution of 25 (50) m when the criterion of potential density exceeding the 10-m value by 0.03 kg m⁻³ is used for the MLD estimation. Using the larger criterion (0.125 kg m⁻³) generally reduces the underestimation. In addition, positive biases greater than 100 m are found in wintertime subpolar regions when MLD criteria based on temperature are used. Biases of the reanalyses are due to both model errors and errors related to differences between the assimilation methods. The result shows that these errors are partially cancelled out through the ensemble averaging. Moreover, the bias in the ensemble mean field of the reanalyses is smaller than in the observation-only analyses. This is largely attributed to the comparably higher resolutions of the reanalyses. The robust reproduction of both the seasonal cycle and interannual variability by the ensemble mean of the reanalyses indicates a great potential of the ensemble mean MLD field for investigating and monitoring upper ocean processes.
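The density-threshold criterion described above can be sketched in a few lines. The profile values below are illustrative, and linear interpolation between levels is an assumption:

```python
import numpy as np

def mixed_layer_depth(depth, sigma, criterion=0.03, ref_depth=10.0):
    """MLD sketch: shallowest depth where potential density exceeds its
    10-m value by `criterion` (kg m^-3), linearly interpolated between
    the bracketing levels. `depth` must increase monotonically."""
    sigma_ref = np.interp(ref_depth, depth, sigma)
    target = sigma_ref + criterion
    below = np.nonzero(sigma > target)[0]
    if below.size == 0:
        return float(depth[-1])            # mixed down to the deepest level
    i = below[0]
    if i == 0:
        return float(depth[0])
    # linear interpolation between the levels bracketing the threshold
    frac = (target - sigma[i - 1]) / (sigma[i] - sigma[i - 1])
    return float(depth[i - 1] + frac * (depth[i] - depth[i - 1]))
```

With a coarser vertical grid the threshold crossing is detected later and interpolated over wider layers, which is consistent with the resolution-dependent underestimation noted above.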
Abstract:
In this work, equations for estimating global solar irradiation (R_G) using the Angstrom model are presented, with seasonal and monthly partitions, for the region of Cascavel, Paraná, Brazil. The experimental data were provided by IAPAR, collected at its meteorological station located at COODETEC/Cascavel over the period 1983 to 1998. Of the 16 years of data, 12 were used to compute the coefficients (a and b) and four to validate the equations. The coefficients of determination found exceeded 80% for both partitions. The minimum of R_G is overestimated and the maximum underestimated when compared with the minimum and maximum of the observed data, which occur at the winter solstice and the spring equinox, respectively. The seasonal and monthly variation of coefficient a was smaller (0.16 to 0.19 and 0.14 to 0.21) and that of coefficient b larger (0.34 to 0.43 and 0.32 to 0.44). The largest variations of the mean daily errors occurred at the spring equinox (-19.45% to 27.28%) and the smallest at the autumn equinox (-11.32% to 10.61%). The most effective fit of the equations was found for the monthly partition.
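The Angstrom model is a linear regression of the clearness index on relative sunshine duration, H/H0 = a + b(n/N). The sketch below fits a and b by least squares to synthetic data; the numbers are illustrative, not the Cascavel series:

```python
import numpy as np

# Angstrom-type regression sketch: kt = H/H0 = a + b * (n/N), where H is
# measured global irradiation, H0 extraterrestrial irradiation, n observed
# sunshine hours and N the astronomical day length. Synthetic data only.
rng = np.random.default_rng(0)
s = rng.uniform(0.2, 0.9, 200)                      # relative sunshine n/N
kt = 0.18 + 0.38 * s + rng.normal(0.0, 0.01, 200)   # clearness index H/H0

b, a = np.polyfit(s, kt, 1)                         # slope b, intercept a
pred = a + b * s
r2 = 1 - np.sum((kt - pred) ** 2) / np.sum((kt - kt.mean()) ** 2)
```

Partitioning the data by season or by month, as in the study, simply means repeating this fit on each subset and keeping one (a, b) pair per partition.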
Abstract:
Systematic errors can have a significant effect on GPS observables. In medium and long baselines, the major systematic error sources are ionospheric and tropospheric refraction and GPS satellite orbit errors; in short baselines, multipath is more relevant. These errors degrade the accuracy of the positioning accomplished by GPS, which is a critical problem for high-precision GPS positioning applications. Recently, a method has been suggested to mitigate these errors: the semiparametric model and the penalised least squares technique. It uses a natural cubic spline to model the errors as a function that varies smoothly in time. The systematic error functions, ambiguities and station coordinates are estimated simultaneously. As a result, the ambiguities and the station coordinates are estimated with better reliability and accuracy than with the conventional least squares method.
Abstract:
Among the positioning systems that compose the GNSS (Global Navigation Satellite System), GPS is capable of providing low-, medium- and high-precision positioning data. However, GPS observables may be subject to many different types of errors. These systematic errors can degrade the accuracy of the positioning provided by GPS, and are mainly related to GPS satellite orbits, multipath, and atmospheric effects. In order to mitigate these errors, a semiparametric model and the penalized least squares technique were employed in this study. This is similar to changing the stochastic model, in which error functions are incorporated, and the results are similar to those obtained by changing the functional model instead. Using this method, it was shown that the ambiguities and the estimated station coordinates were more reliable and accurate than when employing a conventional least squares methodology.
Abstract:
The GPS observables are subject to several errors. Among them, the systematic ones have great impact, because they degrade the accuracy of the accomplished positioning. These errors are mainly related to GPS satellite orbits, multipath and atmospheric effects. Lately, a method has been suggested to mitigate them: the semiparametric model and the penalised least squares technique (PLS). In this method, the errors are modelled as functions varying smoothly in time. This is akin to changing the stochastic model, into which the error functions are incorporated; the results obtained are similar to those in which the functional model is changed. As a result, the ambiguities and the station coordinates are estimated with better reliability and accuracy than with the conventional least squares method (CLS). In general, the solution requires a shorter data interval, minimizing costs. The method's performance was analyzed in two experiments using data from single-frequency receivers. The first was accomplished with a short baseline, where the main error was multipath; in the second, a baseline of 102 km was used, where the predominant errors were due to ionospheric and tropospheric refraction. In the first experiment, using 5 minutes of data collection, the largest coordinate discrepancies with respect to the ground truth reached 1.6 cm and 3.3 cm in the h coordinate for PLS and CLS, respectively; in the second, also using 5 minutes of data, the discrepancies were 27 cm in h for PLS and 175 cm in h for CLS. These tests also showed a considerable improvement in ambiguity resolution using PLS relative to CLS, with a reduced data collection time interval. © Springer-Verlag Berlin Heidelberg 2007.
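A minimal numerical sketch of the penalised least squares idea: a discrete second-difference roughness penalty stands in for the natural cubic spline, and synthetic data replace GPS observables. The regressor, the smooth error function and the value of the smoothing parameter are all illustrative assumptions:

```python
import numpy as np

def semiparametric_pls(y, X, lam):
    """Semiparametric penalised-least-squares sketch for
    y = X @ beta + g(t) + noise, with g a smooth function of the
    observation epochs penalised by its second differences (a discrete
    analogue of the cubic-spline roughness penalty). Returns (beta, g);
    both are estimated simultaneously, as in the method above."""
    n = len(y)
    p = X.shape[1]
    D = np.diff(np.eye(n), 2, axis=0)          # second-difference operator
    A = np.hstack([X, np.eye(n)])              # parametric part + one g node per epoch
    P = np.zeros((p + n, p + n))
    P[p:, p:] = lam * (D.T @ D)                # roughness penalty on g only
    sol = np.linalg.solve(A.T @ A + P, A.T @ y)
    return sol[:p], sol[p:]
```

The penalty separates the slowly varying systematic error from the parametric part, so the parameters of interest (ambiguities and coordinates in the GPS case) are no longer contaminated by the smooth error signal.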
Abstract:
We present a program (Ragu; Randomization Graphical User interface) for statistical analyses of multichannel event-related EEG and MEG experiments. Based on measures of scalp field differences including all sensors, and using powerful, assumption-free randomization statistics, the program yields robust, physiologically meaningful conclusions based on the entire, untransformed, and unbiased set of measurements. Ragu accommodates up to two within-subject factors and one between-subject factor with multiple levels each. Significance is computed as a function of time and can be controlled for type II errors with overall analyses. Results are displayed in an intuitive visual interface that allows further exploration of the findings. A sample analysis of an ERP experiment illustrates the different possibilities offered by Ragu. The aim of Ragu is to maximize statistical power while minimizing the need for a-priori choices of models and parameters (such as inverse models or sensors of interest) that interact with and bias statistics.
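The randomization logic can be sketched as a paired sign-flip test on a whole-scalp effect measure. The statistic below (the norm of the mean sensor-wise difference) is a simplified stand-in for Ragu's actual scalp-field measures, and the data are synthetic:

```python
import numpy as np

def randomization_test(cond_a, cond_b, n_perm=2000, seed=0):
    """Assumption-free paired randomization test sketch: the effect
    measure includes all sensors at once, and the null distribution is
    built by randomly flipping each subject's condition labels.
    Inputs are (subjects x sensors) arrays; returns a permutation p."""
    rng = np.random.default_rng(seed)
    diff = cond_a - cond_b
    observed = np.linalg.norm(diff.mean(axis=0))
    exceed = 0
    for _ in range(n_perm):
        signs = rng.choice([-1.0, 1.0], size=(diff.shape[0], 1))
        if np.linalg.norm((signs * diff).mean(axis=0)) >= observed:
            exceed += 1
    return (exceed + 1) / (n_perm + 1)
```

Because the null distribution is generated from the data themselves, no parametric model, inverse solution or sensor-of-interest choice is needed, which is the point made in the abstract.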
Abstract:
Upper-air observations are a fundamental data source for global atmospheric data products, but uncertainties, particularly in the early years, are not well known. Most of the early observations, which have now been digitized, are prone to a large variety of undocumented uncertainties (errors) that need to be quantified, e.g., for their assimilation in reanalysis projects. We apply a novel approach to estimate errors in upper-air temperature, geopotential height, and wind observations from the Comprehensive Historical Upper-Air Network for the time period from 1923 to 1966. We distinguish between random errors, biases, and a term that quantifies the representativity of the observations. The method is based on a comparison of neighboring observations and is hence independent of metadata, making it applicable to a wide scope of observational data sets. The estimated mean random errors for all observations within the study period are 1.5 K for air temperature, 1.3 hPa for pressure, 3.0 m s⁻¹ for wind speed, and 21.4° for wind direction. The estimates are compared to results of previous studies and analyzed with respect to their spatial and temporal variability.
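The neighbour-comparison principle can be illustrated in a few lines. Assuming two nearby stations share the same true signal and have equal, independent random-error variances (with the representativity term neglected), half the variance of their paired differences recovers the squared random error; all numbers below are synthetic:

```python
import numpy as np

# Var(x1 - x2) = 2 * sigma_eps**2 when both stations observe the same
# true signal with independent errors of common std sigma_eps, so the
# single-station error follows from the paired differences alone,
# without any metadata. Illustrative values, not CHUAN data.
rng = np.random.default_rng(1)
truth = rng.normal(0.0, 5.0, 5000)               # shared true signal (K)
sigma_eps = 1.5                                  # assumed error level (K)
obs1 = truth + rng.normal(0.0, sigma_eps, 5000)
obs2 = truth + rng.normal(0.0, sigma_eps, 5000)
est_error = np.sqrt(0.5 * np.var(obs1 - obs2))   # recovers about 1.5 K
```

In practice the stations are not perfectly collocated, which is why the method also carries a representativity term alongside random error and bias.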
Abstract:
The global ocean is a significant sink for anthropogenic carbon (Cant), absorbing roughly a third of human CO2 emitted over the industrial period. Robust estimates of the magnitude and variability of the storage and distribution of Cant in the ocean are therefore important for understanding the human impact on climate. In this synthesis we review observational and model-based estimates of the storage and transport of Cant in the ocean. We pay particular attention to the uncertainties and potential biases inherent in different inference schemes. On a global scale, three data-based estimates of the distribution and inventory of Cant are now available. While the inventories are found to agree within their uncertainty, there are considerable differences in the spatial distribution. We also present a review of the progress made in the application of inverse and data assimilation techniques which combine ocean interior estimates of Cant with numerical ocean circulation models. Such methods are especially useful for estimating the air–sea flux and interior transport of Cant, quantities that are otherwise difficult to observe directly. However, the results are found to be highly dependent on modeled circulation, with the spread due to different ocean models at least as large as that from the different observational methods used to estimate Cant. Our review also highlights the importance of repeat measurements of hydrographic and biogeochemical parameters to estimate the storage of Cant on decadal timescales in the presence of the variability in circulation that is neglected by other approaches. Data-based Cant estimates provide important constraints on forward ocean models, which exhibit both broad similarities and regional errors relative to the observational fields. A compilation of inventories of Cant gives us a "best" estimate of the global ocean inventory of anthropogenic carbon in 2010 of 155 ± 31 PgC (±20% uncertainty). 
This estimate includes a broad range of values, suggesting that a combination of approaches is necessary in order to achieve a robust quantification of the ocean sink of anthropogenic CO2.
Abstract:
OBJECTIVES The aim of this study was to identify common risk factors for patient-reported medical errors across countries. In country-level analyses, differences in risks associated with error between health care systems were investigated. The joint effects of risks on error-reporting probability were modelled for hypothetical patients with different health care utilization patterns. DESIGN Data from the Commonwealth Fund's 2010 International Survey of the General Public's Views of their Health Care System's Performance in 11 Countries. SETTING Representative population samples of 11 countries were surveyed (total sample = 19,738 adults). Utilization of health care, coordination of care problems and reported errors were assessed. Regression analyses were conducted to identify risk factors for patients' reports of medical, medication and laboratory errors across countries and in country-specific models. RESULTS Error was reported by 11.2% of patients but with marked differences between countries (range: 5.4-17.0%). Poor coordination of care was reported by 27.3%. The risk of patient-reported error was determined mainly by health care utilization: emergency care (OR = 1.7, P < 0.001), hospitalization (OR = 1.6, P < 0.001) and the number of providers involved (OR three doctors = 2.0, P < 0.001) are important predictors. Poor care coordination is the single most important risk factor for reporting error (OR = 3.9, P < 0.001). Country-specific models yielded common and country-specific predictors for self-reported error. For high utilizers of care, the probability that errors are reported rises to P = 0.68. CONCLUSIONS Safety remains a global challenge affecting many patients throughout the world. Large variability exists in the frequency of patient-reported error across countries. To learn from others' errors is not only essential within countries but may also prove a promising strategy internationally.
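As a back-of-envelope illustration of how the reported odds ratios compound for a high utilizer, the sketch below multiplies the ORs onto the baseline reporting odds. This naive product assumes independent effects; the paper's joint logistic model has its own intercept and covariates, so this only approximates the reported probability of 0.68:

```python
def report_probability(base_rate, *odds_ratios):
    """Crude joint-risk sketch: convert a baseline rate to odds,
    multiply by each (assumed independent) odds ratio, and convert
    back to a probability."""
    odds = base_rate / (1.0 - base_rate)
    for ratio in odds_ratios:
        odds *= ratio
    return odds / (1.0 + odds)

# Baseline 11.2%, with emergency care (1.7), hospitalization (1.6),
# three doctors (2.0) and poor coordination (3.9) stacked together.
p_high_utilizer = report_probability(0.112, 1.7, 1.6, 2.0, 3.9)
```

With these numbers the crude estimate lands near 0.73, the same order as the modelled 0.68, showing how quickly moderate individual risks compound.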
Abstract:
Sentinel-5 (S5) and its precursor (S5P) are future European satellite missions aiming at global monitoring of methane (CH4) column-average dry air mole fractions (XCH4). The spectrometers to be deployed onboard the satellites record spectra of sunlight backscattered from the Earth's surface and atmosphere. In particular, they exploit CH4 absorption in the shortwave infrared spectral range around 1.65 µm (S5 only) and 2.35 µm (both S5 and S5P) wavelength. Given an accuracy goal of better than 2% for XCH4 to be delivered on regional scales, assessment and reduction of potential sources of systematic error such as spectroscopic uncertainties is crucial. Here, we investigate how spectroscopic errors propagate into retrieval errors on the global scale. To this end, absorption spectra of a ground-based Fourier transform spectrometer (FTS) operating at very high spectral resolution serve as estimate for the quality of the spectroscopic parameters. Feeding the FTS fitting residuals as a perturbation into a global ensemble of simulated S5- and S5P-like spectra at relatively low spectral resolution, XCH4 retrieval errors exceed 0.6% in large parts of the world and show systematic correlations on regional scales, calling for improved spectroscopic parameters.
Abstract:
Uncertainty information for global leaf area index (LAI) products is important for global modeling studies but usually difficult to obtain systematically at a global scale. Here, we present a new method that cross-validates existing global LAI products and produces consistent uncertainty information. The method is based on a triple collocation error model (TCEM) that assumes errors among LAI products are not correlated. Global monthly absolute and relative uncertainties, at 0.05° spatial resolution, were generated for the MODIS, CYCLOPES, and GLOBCARBON LAI products, with reasonable agreement in terms of spatial patterns and biome types. CYCLOPES shows the lowest absolute and relative uncertainties, followed by GLOBCARBON and MODIS. Grasses, crops, shrubs, and savannas usually have lower uncertainties than forests, in association with the relatively larger forest LAI. With their densely vegetated canopies, tropical regions exhibit the highest absolute uncertainties but the lowest relative uncertainties, the latter of which tend to increase toward higher latitudes. The estimated uncertainties of CYCLOPES generally meet the quality requirement (±0.5) proposed by the Global Climate Observing System (GCOS), whereas for MODIS and GLOBCARBON only non-forest biome types have met the requirement. Nevertheless, none of the products appears to meet the relative uncertainty requirement of 20%. Further independent validation and comparative studies are expected to provide a fair assessment of uncertainties derived from TCEM. Overall, the proposed TCEM is straightforward and could be automated for the systematic processing of real-time remote sensing observations to provide theoretical uncertainty information for a wider range of land products.
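A covariance-based triple collocation estimator in the spirit of TCEM can be sketched as follows. It assumes an additive, unit-gain error model with mutually uncorrelated errors across the three products, and synthetic series stand in for the LAI products:

```python
import numpy as np

def triple_collocation_rmse(x, y, z):
    """Covariance-based triple collocation sketch: for three collinear
    products with mutually uncorrelated errors, the error variance of x
    is Var(x) - Cov(x,y) * Cov(x,z) / Cov(y,z), and symmetrically for
    y and z. Returns the three error standard deviations."""
    c = np.cov(np.vstack([x, y, z]))
    ex = c[0, 0] - c[0, 1] * c[0, 2] / c[1, 2]
    ey = c[1, 1] - c[0, 1] * c[1, 2] / c[0, 2]
    ez = c[2, 2] - c[0, 2] * c[1, 2] / c[0, 1]
    return np.sqrt(np.maximum([ex, ey, ez], 0.0))
```

No product has to be treated as truth: each error level is inferred purely from the mutual covariances, which is why the method yields consistent uncertainty fields for all three products at once.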