870 results for assessment data


Relevance: 40.00%

Abstract:

Maximum entropy modeling (Maxent) is a widely used algorithm for predicting species distributions across space and time. Properly assessing the uncertainty in such predictions is non-trivial and requires validation with independent datasets. Notably, model complexity (the number of model parameters) remains a major concern in relation to overfitting and, hence, the transferability of Maxent models. An emerging approach is to validate the cross-temporal transferability of model predictions using paleoecological data. In this study, we assess the effect of model complexity on the performance of Maxent projections across time, using two European plant species (Alnus glutinosa (L.) Gaertn. and Corylus avellana L.) with an extensive late Quaternary fossil record in Spain as a case study. We fit 110 models with different levels of complexity under present-day conditions and tested model performance using AUC (area under the receiver operating characteristic curve) and AICc (corrected Akaike Information Criterion) through the standard procedure of randomly partitioning current occurrence data. We then compared these results to an independent validation by projecting the models to mid-Holocene (6,000 years before present) climatic conditions in Spain to assess their ability to predict fossil pollen presence-absence and abundance. We find that calibrating Maxent models with default settings results in overly complex models. While model performance increased with model complexity when predicting current distributions, it was highest at intermediate complexity when predicting mid-Holocene distributions. Hence, models of intermediate complexity provided the best trade-off for predicting species distributions across time. Reliable temporal model transferability is especially relevant for forecasting species distributions under future climate change. Consequently, species-specific model tuning should be used to find the best modeling settings to control for complexity, notably with paleoecological data to independently validate model projections. For cross-temporal projections of species distributions for which paleoecological data are not available, models of intermediate complexity should be selected.
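
To make the two model-selection metrics concrete, here is a minimal Python sketch of AUC and AICc with hypothetical inputs; the labels, suitability scores, log-likelihood and parameter count are illustrative, not the study's actual Maxent outputs:

```python
import numpy as np
from sklearn.metrics import roc_auc_score

def aicc(log_likelihood: float, k: int, n: int) -> float:
    """Corrected Akaike Information Criterion.

    k: number of model parameters (e.g. Maxent features with non-zero weights)
    n: number of occurrence records used to fit the model
    """
    aic = 2 * k - 2 * log_likelihood
    return aic + (2 * k * (k + 1)) / (n - k - 1)

# Hypothetical held-out test data: presences/absences and predicted suitability.
y_true = np.array([1, 1, 0, 1, 0, 0, 1, 0])
y_score = np.array([0.9, 0.7, 0.4, 0.8, 0.3, 0.2, 0.6, 0.5])

print("AUC :", roc_auc_score(y_true, y_score))
print("AICc:", aicc(log_likelihood=-120.5, k=10, n=200))  # lower AICc penalizes complexity
```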

Relevance: 40.00%

Abstract:

After decades of mergers and acquisitions and successive technology trends such as CRM, ERP and DW, the data in enterprise systems are scattered and inconsistent. Global organizations face the challenge of addressing local uses of shared business entities, such as customer and material, while at the same time maintaining a consistent, unique, and consolidated view of financial indicators. In addition, current enterprise systems do not accommodate the pace of organizational change, and immense effort is required to maintain data. When it comes to systems integration, ERPs are considered "closed" and expensive. Data structures are complex, and the "out-of-the-box" integration options offered are not based on industry standards. Therefore, expensive and time-consuming projects are undertaken in order to have the required data flowing according to the needs of business processes. Master Data Management (MDM) emerges as a discipline focused on ensuring long-term data consistency. Presented as a technology-enabled business discipline, it emphasizes business process and governance to model and maintain the data related to key business entities. There are immense technical and organizational challenges in accomplishing the "single version of the truth" MDM mantra. Adding one central repository of master data might prove unfeasible in some scenarios, so an incremental approach is recommended, starting from the areas most critically affected by data issues. This research aims to understand the current literature on MDM and contrast it with views from professionals. The data collected from interviews revealed details of the complexities of data structures and data management practices in global organizations, reinforcing the call for more in-depth research on the organizational aspects of MDM. The most difficult piece of master data to manage is the "local" part: the attributes related to the sourcing and storing of materials in one particular warehouse in the Netherlands, or a complex set of pricing rules for a subsidiary of a customer in Brazil. From a practical perspective, this research evaluates one MDM solution under development at a Finnish IT solution provider. By applying an existing assessment method, the research attempts to provide the company with a possible tool to evaluate its product from a vendor-agnostic perspective.
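
As a toy illustration of the "single version of the truth" idea, the sketch below merges duplicate records of one business entity into a golden record using a simple survivorship rule (most recently updated non-empty value wins); the field names and the rule are hypothetical and not taken from any particular MDM product:

```python
from datetime import date

# Two conflicting views of the same customer from different source systems.
records = [
    {"customer_id": "C1", "name": "ACME BV",   "country": "NL", "vat": "",         "updated": date(2012, 3, 1)},
    {"customer_id": "C1", "name": "Acme B.V.", "country": "",   "vat": "NL123456", "updated": date(2013, 7, 9)},
]

def golden_record(duplicates):
    """Merge duplicates: later non-empty values overwrite earlier ones."""
    merged = {}
    for rec in sorted(duplicates, key=lambda r: r["updated"]):
        for field, value in rec.items():
            if value:
                merged[field] = value
    return merged

print(golden_record(records))
```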

Relevance: 40.00%

Abstract:

The present study was designed to compare the homeostasis model assessment (HOMA) and the quantitative insulin sensitivity check index (QUICKI) with data from forearm metabolic studies of healthy individuals and of subjects in various pathological states. Fifty-five healthy individuals and 112 patients in various pathological states, including type 2 diabetes mellitus, essential hypertension and others, were studied after an overnight fast and for 3 h after ingestion of 75 g of glucose, by HOMA, QUICKI and the forearm technique to estimate muscle uptake of glucose, combined with indirect calorimetry (oxidative and non-oxidative glucose metabolism). The patients showed increased HOMA (1.88 ± 0.14 vs 1.13 ± 0.10 pmol/l × mmol/l) and insulin/glucose (I/G) index (1,058.9 ± 340.9 vs 518.6 ± 70.7 pmol/l × (mg/100 ml forearm)⁻¹), and decreased QUICKI (0.36 ± 0.004 vs 0.39 ± 0.006 (µU/ml + mg/dl)⁻¹) compared with the healthy individuals. Analysis of the data for the group as a whole (patients and healthy individuals) showed that the estimate of insulin resistance by HOMA was correlated with data obtained in the forearm metabolic studies (glucose uptake: r = -0.16, P = 0.04; non-oxidative glucose metabolism: r = -0.20, P = 0.01; and I/G index: r = 0.17, P = 0.03). The comparison of QUICKI with data from the forearm metabolic studies showed significant correlations between QUICKI and non-oxidative glucose metabolism (r = 0.17, P = 0.03) and the I/G index (r = -0.37, P < 0.0001). Thus, HOMA and QUICKI are good estimates of insulin sensitivity, agreeing with data derived from forearm metabolic studies involving direct measurements of insulin action on muscle glucose metabolism.
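
For reference, the two indices compared here have standard closed-form definitions; the sketch below implements them with illustrative fasting values (note that the study reports insulin in pmol/l, whereas the classic formulas take µU/ml):

```python
import math

def homa_ir(glucose_mmol_l: float, insulin_uU_ml: float) -> float:
    """Homeostasis model assessment of insulin resistance."""
    return (glucose_mmol_l * insulin_uU_ml) / 22.5

def quicki(glucose_mg_dl: float, insulin_uU_ml: float) -> float:
    """Quantitative insulin sensitivity check index."""
    return 1.0 / (math.log10(insulin_uU_ml) + math.log10(glucose_mg_dl))

# Hypothetical fasting values for a healthy subject, for illustration only.
print(homa_ir(glucose_mmol_l=5.0, insulin_uU_ml=8.0))  # ~1.78 (higher = more resistant)
print(quicki(glucose_mg_dl=90.0, insulin_uU_ml=8.0))   # ~0.35 (lower = more resistant)
```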

Relevance: 40.00%

Abstract:

The combined use of radiosonde data and three-dimensional satellite-derived data over ocean and land is useful for a better understanding of atmospheric thermodynamics. Here, an attempt is made to study the thermodynamic structure of the convective atmosphere during the pre-monsoon season over southwest peninsular India using satellite-derived data and radiosonde data. Stability indices were computed for selected stations over southwest peninsular India, viz. Thiruvananthapuram and Cochin, using radiosonde data for five pre-monsoon seasons. The stability indices studied for the region are the Showalter Index (SI), K Index (KI), Lifted Index (LI), Total Totals Index (TTI), Humidity Index (HI) and Deep Convective Index (DCI), along with thermodynamic parameters such as Convective Available Potential Energy (CAPE) and Convective Inhibition Energy (CINE). The traditional Showalter Index has been modified to incorporate the thermodynamics of the tropical region. MODIS data over south peninsular India are also used for the study. When there is a convective system over south peninsular India, the value of LI over the region is less than −4. On the other hand, a region where LI is more than 2 is comparatively stable, without any convection. Similarly, when KI values are in the range 35 to 40, there is a possibility of convection. The threshold value for TTI is found to be between 50 and 55. Further, we found that prior to convection, the dry bulb temperature at 1000, 850, 700 and 500 hPa is at a minimum and the dew point temperature is at a maximum, which leads to an increase in relative humidity. The total column water vapor is at a maximum in the convective region and at a minimum in the stable region. The threshold values for the different stability indices are found to agree with those reported in the literature.
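
Two of the indices named above, KI and TTI, have simple standard definitions; the following sketch computes them from a hypothetical pre-monsoon sounding (the modified Showalter Index, LI, CAPE and CINE require parcel lifting and are omitted here):

```python
def k_index(t850, t500, td850, t700, td700):
    """K Index: (T850 - T500) + Td850 - (T700 - Td700), all in deg C."""
    return (t850 - t500) + td850 - (t700 - td700)

def total_totals(t850, td850, t500):
    """Total Totals Index: T850 + Td850 - 2 * T500, all in deg C."""
    return t850 + td850 - 2.0 * t500

# Hypothetical sounding values at the standard pressure levels (deg C).
ki = k_index(t850=24.0, t500=-6.0, td850=18.0, t700=12.0, td700=8.0)
tti = total_totals(t850=24.0, td850=18.0, t500=-6.0)
print(ki, tti)  # KI = 44, TTI = 54: both above the convective thresholds cited above
```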

Relevance: 40.00%

Abstract:

As control systems have developed and the implications of poor hygienic practices have become better known, the evaluation of the hygienic status of premises has become more critical. The assessment of the overall status of premises hygiene can provide useful management data, indicating whether the premises are improving or whether, whilst still meeting legal requirements, they might be failing to maintain previously high standards. Since the creation of the Meat Hygiene Service (MHS) for the United Kingdom, one of the aims of the service has been to monitor hygiene on different premises, to provide a means of comparing standards and to identify and encourage improvements. This desire led to the implementation of a scoring system known as the Hygiene Assessment System (HAS). This paper analyses the HAS scores of English slaughterhouses between 1998 and 2005, outlining the main incidents throughout this period. Although rising initially, the later results display a clear decrease in the general hygiene scores. These revealing results coincide with the start of a new meat inspection system in which, after several years of discussion, risk-based inspection is finally becoming a reality within Europe. The paper considers the implications of these changes for the way hygiene standards will be monitored in the future.

Relevance: 40.00%

Abstract:

The performance of flood inundation models is often assessed using satellite-observed data; however, these data have inherent uncertainty. In this study we assess the impact of this uncertainty when calibrating a flood inundation model (LISFLOOD-FP) for a flood event in December 2006 on the River Dee, North Wales, UK. The flood extent is delineated from an ERS-2 SAR image of the event using an active contour model (snake), and water levels at the flood margin are calculated through intersection of the shoreline vector with LiDAR topographic data. Gauged water levels are used to create a reference water surface slope for comparison with the satellite-derived water levels. Residuals between the satellite-observed data points and those from the reference line are spatially clustered into groups of similar values. We show that model calibration achieved using pattern matching of observed and predicted flood extent is negatively influenced by this spatial dependency in the data. By contrast, model calibration using water elevations produces realistic calibrated optimum friction parameters even when spatial dependency is present. To test the impact of removing spatial dependency, a new method of evaluating flood inundation model performance is developed using multiple random subsamples of the water surface elevation data points. By testing for spatial dependency using Moran's I, multiple subsamples of water elevations that have no significant spatial dependency are selected. The model is then calibrated against these data and the results averaged. This gives a near-identical result to calibration using spatially dependent data, but has the advantage of being a statistically robust assessment of model performance in which we can have more confidence. Moreover, by using the variation found in the subsamples of the observed data, it is possible to assess the effects of observational uncertainty on the assessment of flooding risk.
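
The spatial-dependency test at the heart of the subsampling method is Moran's I; below is a minimal sketch with inverse-distance weights and hypothetical shoreline residuals (the permutation test used to judge significance is omitted for brevity):

```python
import numpy as np

def morans_i(coords: np.ndarray, values: np.ndarray) -> float:
    """Moran's I for point data with inverse-distance spatial weights."""
    n = len(values)
    # Pairwise distances; weights are 1/d off-diagonal, 0 on the diagonal.
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    w = np.where(d > 0, 1.0 / np.maximum(d, 1e-12), 0.0)
    z = values - values.mean()
    return (n / w.sum()) * (z @ w @ z) / (z @ z)

rng = np.random.default_rng(42)
coords = rng.uniform(0, 1000, size=(50, 2))  # hypothetical flood-margin point locations (m)
residuals = rng.normal(0, 0.2, size=50)      # hypothetical water-level residuals (m)
print(morans_i(coords, residuals))           # values near 0 suggest no spatial dependency
```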

Relevance: 40.00%

Abstract:

The Advanced Along-Track Scanning Radiometer (AATSR) was launched on Envisat in March 2002. The AATSR instrument is designed to retrieve precise and accurate global sea surface temperature (SST) which, combined with the large data set collected from its predecessors, ATSR and ATSR-2, will provide a long-term record of SST data spanning more than 15 years. This record can be used for independent monitoring and detection of climate change. The AATSR validation programme has successfully completed its initial phase. The programme involves validation of the AATSR-derived SST values using in situ radiometers, in situ buoys and global SST fields from other data sets. The results of the initial programme presented here demonstrate that the AATSR instrument is currently close to meeting its scientific objective of determining global SST to an accuracy of 0.3 K (one sigma). For night-time data, the analysis gives a warm bias of between +0.04 K (0.28 K) for buoys and +0.06 K (0.20 K) for radiometers, with slightly higher errors observed for daytime data, which shows warm biases of between +0.02 K (0.39 K) for buoys and +0.11 K (0.33 K) for radiometers. These results show that the ATSR series of instruments continues to be the world leader in delivering accurate space-based observations of SST, a key climate parameter.
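
The validation statistics quoted above are simply the mean and one-sigma standard deviation of satellite-minus-in-situ matchup differences; a minimal sketch, with hypothetical matchup values, follows:

```python
import numpy as np

# Hypothetical collocated SST matchups in kelvin (not real AATSR data).
sst_aatsr = np.array([290.15, 291.02, 289.80, 292.31])   # satellite retrievals
sst_insitu = np.array([290.10, 290.95, 289.85, 292.20])  # buoy/radiometer measurements

diff = sst_aatsr - sst_insitu
print(f"bias  = {diff.mean():+.2f} K")   # positive mean difference => warm bias
print(f"sigma = {diff.std(ddof=1):.2f} K")
```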

Relevance: 40.00%

Abstract:

Fire activity has varied globally and continuously since the last glacial maximum (LGM) in response to long-term changes in global climate and shorter-term regional changes in climate, vegetation, and human land use. We have synthesized sedimentary charcoal records of biomass burning since the LGM and present global maps showing changes in fire activity for time slices during the past 21,000 years (as differences in charcoal accumulation values compared to pre-industrial levels). There is strong broad-scale coherence in fire activity after the LGM, but spatial heterogeneity in the signals increases thereafter. In North America, Europe and southern South America, charcoal records indicate less-than-present fire activity during the deglacial period, from 21,000 to ∼11,000 cal yr BP. In contrast, the tropical latitudes of South America and Africa show greater-than-present fire activity from ∼19,000 to ∼17,000 cal yr BP, and most sites from Indochina and Australia show greater-than-present fire activity from 16,000 to ∼13,000 cal yr BP. Many sites indicate greater-than-present or near-present activity during the Holocene, with the exception of eastern North America and eastern Asia from 8,000 to ∼3,000 cal yr BP, Indonesia and Australia from 11,000 to 4,000 cal yr BP, and southern South America from 6,000 to 3,000 cal yr BP, where fire activity was less than present. Regional coherence in the patterns of change in fire activity was evident throughout the post-glacial period. These complex patterns can largely be explained in terms of large-scale climate controls modulated by local changes in vegetation and fuel load.

Relevance: 40.00%

Abstract:

This presentation was offered as part of the CUNY Library Assessment Conference, Reinventing Libraries: Reinventing Assessment, held at the City University of New York in June 2014.

Relevance: 40.00%

Abstract:

This thesis evaluates different sites for a weather measurement system and a suitable PV simulation for the University of Surabaya (UBAYA) in Indonesia (Java). The weather station is able to monitor all common weather phenomena, including solar insolation. It is planned to use the data for scientific and educational purposes in renewable energy studies. During evaluation and installation it became apparent that official specifications from global meteorological organizations could not be met for some sensors, owing to the conditions of the UBAYA campus. After arranging the hardware, the weather at the site was monitored for a period of time. A comparison with different official sources, both ground-based and satellite-based measurements, showed differences in wind and solar radiation. In some cases the monthly average solar insolation deviated by 42% for the satellite-based measurements; for the ground-based sources the deviation was less than 10%. The average wind speed differed by 33% compared to a source that evaluated wind power in Surabaya. The wind direction shows deviations towards the east compared with data from the local weather station at the airport. PSET has the chance to receive investment to investigate photovoltaics on its own roof. Several simulations identified a suitable roof direction and the expected yearly and monthly outputs. With a 7.7 kWpeak PV installation using the latest crystalline technology on the market, 8.82 MWh/year could be achieved with weather data from 2012; thin-film technology could increase this value to 9.13 MWh/year. The roofs have enough area to install the PV system. However, the low price of electricity in Indonesia makes it uneconomical to feed the energy into the public grid.
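
As a rough plausibility check on the simulated yield, the common rule of thumb E = P_peak × H × PR can be applied, where H is the annual plane-of-array insolation (kWh/m², referenced to the 1 kW/m² STC irradiance) and PR the performance ratio. The insolation and PR below are assumed values chosen to roughly reproduce the quoted figure, not numbers from the thesis:

```python
def annual_yield_kwh(p_kwp: float, insolation_kwh_m2: float, performance_ratio: float) -> float:
    """Back-of-envelope annual PV energy yield in kWh."""
    return p_kwp * insolation_kwh_m2 * performance_ratio

# ~1,600 kWh/m2/yr insolation and PR = 0.72 give ~8,870 kWh for the 7.7 kWp
# system, close to the simulated 8.82 MWh/year.
print(annual_yield_kwh(7.7, 1600.0, 0.72))
```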