973 results for Calibration data


Relevance: 60.00%

Abstract:

Continental and marine conditions during the last millennium off Porto, Portugal (the southern pole of the North Atlantic Oscillation, NAO), are reconstructed from a sediment archive through a high-resolution multiproxy study and instrumental evidence. Results show multidecadal variability and sea surface temperatures (SSTs) that correlate well with previously published land- and sea-based Northern Hemisphere temperature records, and appear to respond to long-term solar insolation variability. Precipitation was negatively correlated with the NAO, whereas strong flooding events occurred at times of marked climate cooling (AD 1100-1150 and 1400-1470) and transitions in solar activity. AD 1850 marks a major shift in the phytoplankton community, associated with a decoupling of the δ18O records of three planktonic foraminiferal species. These changes are interpreted as a response to a reduction in summer and/or annual upwelling and more frequent fall-winter upwelling-like events. The coincidence of this shift with a decrease in SST and with increased coherence between our data and the Atlantic Multidecadal Oscillation (AMO) confirms the connection of upwelling variability to the North Atlantic Ocean's surface and thermohaline circulation on a decadal scale. The disappearance of this agreement between the AMO and our records after AD 1850, coinciding with the beginning of the recent rise in atmospheric CO2, supports the hypothesis of a strong anthropogenic effect on the last ~150 yr of the climate record. It also raises an important question about the use of instrumental records as the sole calibration data set for climate reconstructions, as these may not provide the best analogue for climate beyond AD 1730.

Relevance: 60.00%

Abstract:

Sea-surface temperature (SST) estimates in sediment core MD01-2390, based on planktonic foraminiferal species abundances and five different transfer function techniques, suggest nearly unchanged or even higher temperatures in the tropical southern South China Sea (SCS) during the Last Glacial Maximum (LGM) relative to modern temperatures. These results are in contrast to the substantial cooling of 2-5 °C inferred from geochemical (Uk'37, Mg/Ca ratios) and terrestrial proxies in the western tropical Pacific region. Using multivariate statistics, we show that the glacial southern SCS harboured unique planktonic foraminiferal assemblages that have no modern analogs. Analyses of faunal variation through the core reveal that planktonic foraminiferal assemblages responded to the temperature changes inferred from Mg/Ca data, but that this signal is subdued by superimposed variations in the relative abundance of Pulleniatina obliquiloculata and Neogloboquadrina pachyderma (dextral). These species occur in glacial samples at proportions that are not observed in the calibration data set. The high glacial abundance of N. pachyderma (dextral) is interpreted to reflect a seasonal (winter) inflow of cold surface water from the northeast via the Bashi Strait, due to the combined effects of an intensified winter monsoon, a southward shift of the polar front and the eastward migration of the Kuroshio Current. In contrast, the processes controlling the high relative abundances of P. obliquiloculata during the LGM may be unique to the southern SCS. We propose a scenario involving stronger (winter) mixing or enhanced upwelling due to an intensified winter monsoon, which prevented shallow-dwelling warm indicators from establishing larger populations during the LGM. Our results indicate that this no-analog behaviour of planktonic foraminiferal faunas is responsible for the warm glacial conditions implied by foraminiferal transfer functions in this part of the western Pacific warm pool, and that the more significant surface cooling implied by terrestrial and geochemical (Mg/Ca ratios; alkenone unsaturation index) marine proxies is the more likely scenario.
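A minimal sketch of the kind of no-analog screen such studies rely on, using squared chord distance between a fossil assemblage and a core-top calibration set; the data, taxon count and threshold rule are illustrative assumptions, not values from the study:

```python
import numpy as np

def squared_chord(p, q):
    """Squared chord distance between two relative-abundance vectors."""
    return np.sum((np.sqrt(p) - np.sqrt(q)) ** 2)

def min_analog_distance(fossil, calibration):
    """Smallest dissimilarity between a fossil sample and any core-top
    (calibration) sample; large values flag no-analog assemblages."""
    return min(squared_chord(fossil, c) for c in calibration)

# Hypothetical data: rows are samples, columns are taxon proportions.
rng = np.random.default_rng(0)
calibration = rng.dirichlet(np.ones(40), size=200)   # 200 core tops, 40 taxa
fossil = rng.dirichlet(np.ones(40))                  # one downcore sample

d = min_analog_distance(fossil, calibration)
print(f"min squared chord distance: {d:.3f}")
# A common rule of thumb flags samples whose minimum distance exceeds a
# chosen percentile of the distances within the calibration set itself.
```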

Relevance: 60.00%

Abstract:

Impact response surfaces (IRSs) depict the response of an impact variable to changes in two explanatory variables as a plotted surface. Here, IRSs of spring and winter wheat yields were constructed from a 25-member ensemble of process-based crop simulation models. Twenty-one models were calibrated by different groups using a common set of calibration data, with calibrations applied independently to the same models in three cases. The sensitivity of modelled yield to changes in temperature and precipitation was tested by systematically modifying values of 1981-2010 baseline weather data to span the range of changes projected for the late 21st century at three locations in Europe.
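The IRS construction itself is straightforward to sketch: apply a grid of systematic temperature and precipitation perturbations to the baseline weather and record the modelled yield at each grid point. The toy yield function below is a stand-in for a process-based crop model, not anything from the ensemble:

```python
import numpy as np

# Hypothetical stand-in for a process-based crop model: maps a temperature
# offset (degC) and a precipitation change (%) applied to baseline weather
# onto a simulated yield (t/ha). A real study would run the crop model here.
def simulated_yield(dT, dP):
    return 6.0 - 0.15 * dT**2 + 0.02 * dP - 0.0005 * dP**2

# Systematic perturbations of the 1981-2010 baseline, as in an IRS analysis.
dT_grid = np.arange(-2, 9, 1)       # temperature changes, degC
dP_grid = np.arange(-50, 51, 10)    # precipitation changes, %

surface = np.array([[simulated_yield(dT, dP) for dP in dP_grid]
                    for dT in dT_grid])
# 'surface' is the impact response surface: one yield per (dT, dP) pair,
# typically drawn as yield contours over the two explanatory axes.
print(surface.shape)  # (11, 11)
```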

Relevance: 60.00%

Abstract:

A procedure is proposed for measuring the overheating temperature (ΔT) of the p-n junction area in photovoltaic (PV) cells converting laser or solar radiation, relative to the ambient temperature, for conditions in which the cell is connected to an electric load. The basis of the procedure is the measurement of the open-circuit voltage (V_OC) during the initial time period after fast disconnection of the external resistive load. Simultaneous temperature control on an external heated part of the PV module provides the means for determining the value of V_OC at ambient temperature; comparing it with the value measured after switching off the load makes the calculation of ΔT possible. Calibration data on the V_OC = f(T) dependences for single-junction AlGaAs/GaAs and triple-junction InGaP/GaAs/Ge PV cells are presented. The temperature dynamics of the PV cells has been determined under flash illumination and during fast commutation of the load. Temperature measurements were taken in two cases: conversion of continuous laser power by single-junction cells, and conversion of solar power by triple-junction cells operating in concentrator modules.
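A minimal sketch of the ΔT arithmetic, assuming a linear V_OC(T) calibration; the slope below is an illustrative single-junction value, not a number from the paper:

```python
# Minimal sketch of the DeltaT estimate, assuming a linear Voc(T) calibration.
# The slope (-2.0 mV/K) is an illustrative value for a single-junction cell,
# not one taken from the study's calibration data.

K_VOC = -2.0e-3          # calibration slope dVoc/dT, V per K (assumed)

def overheating(voc_measured, voc_at_ambient, k=K_VOC):
    """Junction overheating relative to ambient, from the Voc drop measured
    immediately after the external load is disconnected."""
    return (voc_measured - voc_at_ambient) / k

# Example: Voc read just after switching off the load is 24 mV below the
# value the calibration predicts at ambient temperature.
print(f"dT = {overheating(0.976, 1.000):.1f} K")   # 12.0 K
```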

Relevance: 60.00%

Abstract:

Two independent multidisciplinary studies of climatic change during the glacial–Holocene transition (ca. 14,000–9,000 calendar yr B.P.) from Norway and Switzerland have assessed organism responses to the rapid climatic changes and made quantitative temperature reconstructions with modern calibration data sets (transfer functions). Chronology at Kråkenes, western Norway, was derived from calibration of a high-resolution series of 14C dates. Chronologies at Gerzensee and Leysin, Switzerland, were derived by comparison of δ18O in lake carbonates with the δ18O record from the Greenland Ice Core Project. Both studies demonstrate the sensitivity of terrestrial and aquatic organisms to rapid temperature changes and their value for quantitative reconstruction of the magnitudes and rates of the climatic changes. The rates in these two terrestrial records are comparable to those in Greenland ice cores, but the actual temperatures inferred apply to the terrestrial environments of the two regions.

Relevance: 60.00%

Abstract:

Global biodiversity patterns are now commonly studied through the predictions generated by different ecological niche models. These models are usually calibrated with data drawn from open-access databases (e.g. GBIF). However, despite being easy to download and access, the stored information on the localities where species are present typically contains biases and errors. Such problems in the calibration data can drastically alter model predictions and thereby mask the real macroecological patterns. The aim of this work is to investigate which methods produce the most accurate results when the calibration data are biased, and which perform best when the calibration data contain errors in addition to biases. To that end, we created a virtual species, projected its distribution onto the Iberian Peninsula, sampled that distribution in a biased way, and calibrated two types of distribution model (Bioclim and Maxent) with samples of different sizes. Our results indicate that when the data are only biased, Bioclim outperforms Maxent. However, Bioclim is extremely sensitive to errors in the calibration data; in those situations Maxent behaves far more robustly and yields better-fitting predictions.
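A minimal sketch of a Bioclim-style envelope model, the simpler of the two methods compared here: a site is scored by how deeply its climate values sit within the percentile range spanned by the calibration presences. The presence data and variables are synthetic assumptions:

```python
import numpy as np

# Synthetic presences: mean temperature (degC), annual precipitation (mm).
rng = np.random.default_rng(1)
presences = rng.normal(loc=[15.0, 600.0], scale=[2.0, 100.0], size=(50, 2))

def bioclim_score(site, calib):
    """Per-variable percentile rank folded around the median; the site score
    is the minimum across variables (0 = outside envelope, 0.5 = centre)."""
    ranks = np.array([(calib[:, j] <= site[j]).mean()
                      for j in range(calib.shape[1])])
    return np.min(np.minimum(ranks, 1.0 - ranks))

print(bioclim_score(np.array([15.5, 580.0]), presences))  # near the centre
print(bioclim_score(np.array([25.0, 100.0]), presences))  # outside: 0.0
```

Because the score depends only on the range of the calibration presences, a single erroneous record can stretch the envelope drastically, which is consistent with the sensitivity to errors reported above.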

Relevance: 60.00%

Abstract:

In spite of the important role played by the Southern Ocean in global climate, the few existing paleoceanographic records in the east Pacific sector do not extend beyond one glacial-interglacial cycle, hindering circumpolar comparison of past sea surface temperature (SST) evolution in the Southern Ocean. Here we present three alkenone-based Pleistocene SST records from the subantarctic and subtropical Pacific. We use a regional core-top calibration data set to constrain the choice of calibrations for paleo-SST estimation. Our core-top data confirm that the alkenone-based UK37 and UK'37 values correlate linearly with SST, in a similar fashion to the most commonly used laboratory culture-based calibrations, even at low temperatures (down to ~1°C), rendering these calibrations appropriate for application in the subantarctic Pacific. However, the two alkenone indices yield diverging temporal trends in the Pleistocene SST records. On the basis of better agreement with δ18O records and other SST records in the subantarctic Southern Ocean, we propose that UK37 is a better index for SST reconstruction in this region than the more commonly used UK'37 index. The UK37-derived SST records suggest glacial cooling of ~8°C and ~4°C in the subantarctic and subtropical Pacific, respectively. This extent of subantarctic glacial cooling is comparable to that in other sectors of the Southern Ocean, indicating uniform circumpolar cooling during the Pleistocene. Furthermore, our SST records imply massive equatorward migrations of the Antarctic Circumpolar Current (ACC) frontal systems and enhanced transport of ACC water to lower latitudes via the Peru-Chile Current during glacials.
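As a rough illustration of how a linear alkenone calibration converts an index value to SST, the sketch below uses the widely cited global core-top coefficients of Müller et al. (1998) for UK'37; a regional calibration such as the one discussed here would refit both coefficients:

```python
# Minimal sketch of a linear alkenone palaeothermometer, using the global
# core-top calibration UK'37 = 0.033*SST + 0.044 (Muller et al., 1998).

def uk37_prime(c37_2, c37_3):
    """UK'37 index from di- and tri-unsaturated C37 alkenone abundances."""
    return c37_2 / (c37_2 + c37_3)

def sst_from_uk37_prime(uk, slope=0.033, intercept=0.044):
    return (uk - intercept) / slope

uk = uk37_prime(c37_2=0.62, c37_3=0.38)
print(f"UK'37 = {uk:.3f} -> SST = {sst_from_uk37_prime(uk):.1f} degC")
# The UK37 index discussed above additionally includes the tetra-unsaturated
# C37:4 alkenone, which matters at the cold temperatures of this study area.
```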

Relevance: 60.00%

Abstract:

Semipermeable membrane devices (SPMDs) have been used as passive air samplers of semivolatile organic compounds in a range of studies. However, due to a lack of calibration data for polycyclic aromatic hydrocarbons (PAHs), SPMD data have not been used to estimate air concentrations of target PAHs. In this study, SPMDs were deployed for 32 days at two sites in a major metropolitan area in Australia, with high-volume active sampling systems (HiVol) co-deployed at both sites. Using the HiVol air concentration data from one site, SPMD sampling rates were measured for 12 US EPA Priority Pollutant PAHs, and these values were then used to determine air concentrations at the second site from SPMD concentrations. Air concentrations were also measured at the second site with the co-deployed HiVols to validate the SPMD results. PAHs mostly associated with the vapour phase (fluorene to pyrene) dominated both the HiVol and passive air samples. Reproducibility between replicate passive samplers was satisfactory (CV < 20%) for the majority of compounds. Sampling rates ranged between 0.6 and 6.1 m^3 d^-1. SPMD-based air concentrations were calculated at the second site for each compound using these sampling rates, and the differences between SPMD-derived air concentrations and those measured using a HiVol were, on average, within a factor of 1.5. The dominant processes for the uptake of PAHs by SPMDs were also assessed. Using the SPMD method described herein, estimates of particulate-sorbed airborne PAHs with five or more rings were within 1.8-fold of HiVol-measured values.
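The back-calculation of air concentrations from passive sampler data follows the linear-uptake relation C_air = N / (Rs * t); a minimal sketch with illustrative numbers, not values from the study:

```python
# Minimal sketch of the linear-uptake estimate used with passive samplers:
# C_air = N / (Rs * t), where N is the analyte mass accumulated by the SPMD,
# Rs the compound-specific sampling rate, and t the deployment time.

def air_concentration(n_ng, rs_m3_per_day, days):
    """Air concentration in ng/m^3 from accumulated mass (ng)."""
    return n_ng / (rs_m3_per_day * days)

# e.g. 250 ng of pyrene accumulated over a 32-day deployment at Rs = 3.0 m^3/d
print(f"{air_concentration(250.0, 3.0, 32):.2f} ng/m^3")  # ~2.60
```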

Relevance: 60.00%

Abstract:

The Operator Choice Model (OCM) was developed to model the behaviour of operators attending to complex tasks involving interdependent concurrent activities, such as in Air Traffic Control (ATC). The purpose of the OCM is to provide a flexible framework for modelling and simulation that can be used for quantitative analyses in human reliability assessment, comparison between human-computer interaction (HCI) designs, and analysis of operator workload. The OCM virtual operator is essentially a cycle of four processes: Scan, Classify, Decide Action, Perform Action. Once a cycle is complete, the operator returns to the Scan process; it is also possible to truncate a cycle and return to Scan after any of the processes. These processes are described using Continuous Time Probabilistic Automata (CTPA). The details of the probability and timing models are specific to the domain of application, and need to be specified with the help of domain experts. We are building an application of the OCM for use in ATC. In order to develop a realistic model we are calibrating the probability and timing models that comprise each process using data from a series of experiments conducted with student subjects. These experiments have identified the factors that influence perception and decision making in simplified conflict detection and resolution tasks. This paper presents an application of the OCM approach to a simple ATC conflict detection experiment. The aim is to calibrate the OCM so that its behaviour resembles that of the experimental subjects when it is challenged with the same task; its behaviour should also interpolate when challenged with scenarios similar to those used to calibrate it. The approach illustrated here uses logistic regression to model the classifications made by the subjects, as sketched below. This model is fitted to the calibration data and provides an extrapolation to classifications in scenarios outside the calibration data. A simple strategy is used to calibrate the timing component of the model, and the resulting reaction times are compared between the OCM and the student subjects. While this approach to timing does not capture the full complexity of the reaction-time distribution seen in the data from the student subjects, the mean and the tail of the distributions are similar.
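A minimal sketch of that classification step, fitting a logistic model to synthetic calibration trials and extrapolating to new scenarios; the features and data are assumed stand-ins for the experimental factors, not the study's variables:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic calibration trials: two hypothetical features per scenario,
# e.g. separation at closest approach (nm) and relative speed (kt).
rng = np.random.default_rng(2)
X = rng.uniform([0.0, 100.0], [20.0, 500.0], size=(200, 2))
p_true = 1.0 / (1.0 + np.exp(0.5 * X[:, 0] - 5.0))   # closer pairs -> conflict
y = rng.random(200) < p_true                          # subject's calls

model = LogisticRegression(max_iter=1000).fit(X, y)   # fit to calibration data
new_scenarios = np.array([[2.0, 250.0], [15.0, 250.0]])
print(model.predict_proba(new_scenarios)[:, 1])  # P(classified as conflict)
```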

Relevance: 60.00%

Abstract:

Traffic incidents are a major source of traffic congestion on freeways. Freeway traffic diversion using pre-planned alternate routes has been used as a strategy to reduce traffic delays due to major traffic incidents. However, it is not always beneficial to divert traffic when an incident occurs: route diversion may adversely impact traffic on the alternate routes and may not result in an overall benefit. This dissertation research applies Artificial Neural Network (ANN) and Support Vector Regression (SVR) techniques to predict the percent of delay reduction from route diversion, to help determine whether traffic should be diverted under given conditions. The DYNASMART-P mesoscopic traffic simulation model was applied to generate the simulated data used to develop the ANN and SVR models, with a sample network that comes with the DYNASMART-P package serving as the base simulation network. Combinations of different levels of incident duration, capacity loss, percent of drivers diverted, VMS (variable message sign) messaging duration, and network congestion were simulated to represent different incident scenarios. The resulting percent of delay reduction, average speed, and queue length from each scenario were extracted from the simulation output. The ANN and SVR models were then calibrated for percent of delay reduction as a function of all of the simulated input and output variables. The results show that both the calibrated ANN and SVR models, when applied to the same location used to generate the calibration data, were able to predict delay reduction with relatively high accuracy in terms of mean square error (MSE) and regression correlation, and that the performance of the ANN model was superior to that of the SVR model. Likewise, when the models were applied to a new location, only the ANN model could produce comparatively good delay-reduction predictions under high network congestion levels.
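A minimal sketch of the SVR side of such a calibration, using synthetic stand-ins for the simulated inputs and the percent-delay-reduction output; the feature set and relationship are assumptions, not the dissertation's data:

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Synthetic placeholders for the five simulated factors (incident duration,
# capacity loss, % diverted, VMS duration, congestion), scaled to [0, 1].
rng = np.random.default_rng(3)
X = rng.uniform(size=(300, 5))
y = 40 * X[:, 2] * (1 - X[:, 4]) + rng.normal(0, 2, 300)  # toy relationship

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
model.fit(X[:200], y[:200])                       # calibration data
mse = np.mean((model.predict(X[200:]) - y[200:]) ** 2)
print(f"validation MSE: {mse:.2f}")
```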

Relevance: 60.00%

Abstract:

In 2010, the American Association of State Highway and Transportation Officials (AASHTO) released a safety analysis software system known as SafetyAnalyst. SafetyAnalyst implements the empirical Bayes (EB) method, which requires the use of Safety Performance Functions (SPFs). The system is equipped with a set of national default SPFs, and the software calibrates the default SPFs to represent the agency's safety performance. However, it is recommended that agencies generate agency-specific SPFs whenever possible. Many investigators support the view that the agency-specific SPFs represent the agency data better than the national default SPFs calibrated to agency data. Furthermore, it is believed that the crash trends in Florida are different from the states whose data were used to develop the national default SPFs. In this dissertation, Florida-specific SPFs were developed using the 2008 Roadway Characteristics Inventory (RCI) data and crash and traffic data from 2007-2010 for both total and fatal and injury (FI) crashes. The data were randomly divided into two sets, one for calibration (70% of the data) and another for validation (30% of the data). The negative binomial (NB) model was used to develop the Florida-specific SPFs for each of the subtypes of roadway segments, intersections and ramps, using the calibration data. Statistical goodness-of-fit tests were performed on the calibrated models, which were then validated using the validation data set. The results were compared in order to assess the transferability of the Florida-specific SPF models. The default SafetyAnalyst SPFs were calibrated to Florida data by adjusting the national default SPFs with local calibration factors. The performance of the Florida-specific SPFs and SafetyAnalyst default SPFs calibrated to Florida data were then compared using a number of methods, including visual plots and statistical goodness-of-fit tests. The plots of SPFs against the observed crash data were used to compare the prediction performance of the two models. Three goodness-of-fit tests, represented by the mean absolute deviance (MAD), the mean square prediction error (MSPE), and Freeman-Tukey R2 (R2FT), were also used for comparison in order to identify the better-fitting model. The results showed that Florida-specific SPFs yielded better prediction performance than the national default SPFs calibrated to Florida data. The performance of Florida-specific SPFs was further compared with that of the full SPFs, which include both traffic and geometric variables, in two major applications of SPFs, i.e., crash prediction and identification of high crash locations. The results showed that both SPF models yielded very similar performance in both applications. These empirical results support the use of the flow-only SPF models adopted in SafetyAnalyst, which require much less effort to develop compared to full SPFs.
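A minimal sketch of fitting a flow-only SPF of the SafetyAnalyst form, crashes = exp(b0) * AADT^b1 * length, as a negative binomial GLM with segment length as an offset; the data are synthetic placeholders, not the Florida RCI segments:

```python
import numpy as np
import statsmodels.api as sm

# Synthetic roadway segments: AADT and segment length (miles).
rng = np.random.default_rng(4)
aadt = rng.uniform(2_000, 60_000, 400)
length = rng.uniform(0.2, 3.0, 400)
mu = np.exp(-7.0 + 0.85 * np.log(aadt)) * length        # assumed true SPF
crashes = rng.poisson(rng.gamma(2.0, mu / 2.0))         # overdispersed counts

X = sm.add_constant(np.log(aadt))
fit = sm.GLM(crashes, X, family=sm.families.NegativeBinomial(alpha=0.5),
             offset=np.log(length)).fit()
print(fit.params)   # intercept and AADT exponent
# Goodness of fit (MAD, MSPE, Freeman-Tukey R2) would then be computed on a
# held-out validation set, as in the study's 70/30 calibration/validation split.
```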

Relevance: 60.00%

Abstract:

In this work, desorption/ionization mass spectrometry was employed for the analysis of sugars and small platform chemicals that are common intermediates in biomass transformation reactions. Specifically, matrix-assisted laser desorption/ionization (MALDI) and desorption electrospray ionization (DESI) mass spectrometric techniques were employed as alternatives to traditional chromatographic methods. Ionic liquid matrices (ILMs) were designed based on traditional solid MALDI matrices (2,5-dihydroxybenzoic acid (DHB) and α-cyano-4-hydroxycinnamic acid (CHCA)) and 1,3-dialkylimidazolium ionic liquids ([BMIM]Cl, [EMIM]Cl, and [EMIM]OAc) that have been employed as reaction media for biomass transformation reactions such as the conversion of carbohydrates to valuable platform chemicals. Although two new ILMs were synthesized ([EMIM][DHB] and [EMIM][CHCA], from [EMIM]OAc), the chloride-containing ILs did not react with the matrices and resulted in mixtures of IL and matrix in solution. Compared to the parent solid matrices, much less matrix interference was observed in the low-mass region of the mass spectrum (< 500 Da) using each of the IL-matrices. Furthermore, the formation of a true ILM (i.e. a new ion pair) does not appear to be necessary for analyte ionization. MALDI sample preparation techniques were optimized based on compatibility with analyte, IL and matrix. ILMs and IL-matrix mixtures of DHB allowed for qualitative analysis of glucose, fructose, sucrose and N-acetyl-D-glucosamine; analogous CHCA-containing ILMs did not produce appreciable analyte signals under similar conditions. Small platform compounds such as 5-hydroxymethylfurfural (HMF) and levulinic acid were not detected by direct MALDI-MS analysis. Furthermore, sugar analyte signals were only detected at relatively high matrix:IL:analyte ratios (1:1:1) due to significant matrix and analyte suppression by the IL ions. Therefore, chemical modification of analytes with glycidyltrimethylammonium chloride (GTMA) was employed to extend the method to quantitative applications. Derivatization was accomplished in aqueous IL solutions with fair reaction efficiencies (36.9-48.4% glucose conversion). Calibration curves of derivatized glucose-GTMA yielded good linearity in all solvent systems tested, with lower %RSDs of analyte ion signals in IL solutions than in purely aqueous systems (1.2-7.2% and 4.2-8.7%, respectively). Derivatization resulted in a substantial increase in sensitivity for MALDI-MS analyses: glucose was reliably detected at IL:analyte ratios of 100:1 (as compared to 1:1 prior to derivatization). Screening of all test analytes produced appreciable analyte signals in MALDI-MS spectra, including both HMF and levulinic acid. Using appropriate internal standards, calibration curves were constructed and the method was employed to monitor a model dehydration reaction of fructose to HMF in [BMIM]Cl. Calibration curves showed wide dynamic ranges (LOD to 100 ng fructose/μg [BMIM]Cl; LOD to 75 ng HMF/μg [BMIM]Cl) with correlation coefficients of 0.9973 (fructose) and 0.9931 (HMF). LODs were estimated from the calibration data to be 7.2 ng fructose/μg [BMIM]Cl and 7.5 ng HMF/μg [BMIM]Cl; however, the relatively high S/N ratios at these concentrations indicate that these values are likely overestimated. Application of this method allowed for the rapid acquisition of quantitative data without the need for prior separation of analyte and IL.
Finally, the small-molecule platform chemicals HMF and levulinic acid were qualitatively analyzed by DESI-MS. Both HMF and levulinic acid were easily ionized, and the corresponding molecular ions were readily detected in the presence of a 10- to 100-fold excess of IL, without the need for chemical modification prior to analysis. DESI-MS analysis of ILs in positive and negative ion modes resulted in few ions in the low-mass region, showing great potential for the analysis of small molecules in IL media.
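A minimal sketch of the calibration-curve and LOD arithmetic underlying such estimates, using the common LOD = 3.3 s/m rule (residual scatter s, slope m); the numbers are illustrative, not the fructose/HMF data reported above:

```python
import numpy as np

# Hypothetical internal-standard calibration points.
amount = np.array([5, 10, 25, 50, 75, 100.0])            # ng analyte / ug IL
ratio = np.array([0.11, 0.21, 0.49, 1.02, 1.48, 2.05])   # analyte/IS signal

slope, intercept = np.polyfit(amount, ratio, 1)
resid = ratio - (slope * amount + intercept)
s = resid.std(ddof=2)                    # residual standard deviation
lod = 3.3 * s / slope                    # common LOD estimate

r = np.corrcoef(amount, ratio)[0, 1]     # linearity of the curve
print(f"slope={slope:.4f}, r={r:.4f}, LOD ~ {lod:.1f} ng/ug")
```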

Relevance: 60.00%

Abstract:

Quantitative estimation of surface ocean productivity and bottom water oxygen concentration from benthic foraminifera was attempted using 70 samples from equatorial and North Pacific surface sediments. These samples come from a well-defined depth range in the ocean, between 2200 and 3200 m, so that depth-related factors do not interfere with the estimation. Samples were selected so that foraminifera were well preserved in the sediments and temperature and salinity were nearly uniform (T = 1.5 °C; S = 34.6 per mil). The sample set was also assembled so as to minimize the correlation often seen between surface ocean productivity and bottom water oxygen values (r^2 = 0.23 for prediction purposes in this case). This procedure reduced the chances of spurious results due to correlations between the environmental variables. The samples encompass a range of productivities from about 25 to >300 g C m^-2 yr^-1, and a bottom water oxygen range from 1.8 to 3.5 ml/L. Benthic foraminiferal assemblages were quantified using the >62 µm fraction of the sediments and 46 taxon categories. MANOVA multivariate regression was used to project the faunal matrix onto the two environmental dimensions, using published values of productivity and bottom water oxygen to calibrate this operation. The success of this regression was measured with the multivariate r^2, which was 0.98 for the productivity dimension and 0.96 for the oxygen dimension. These high coefficients indicate that both environmental variables are strongly embedded in the faunal data matrix. Analysis of the beta regression coefficients shows that the environmental signals are carried by groups of taxa that are consistent with previous work characterizing benthic foraminiferal responses to productivity and bottom water oxygen. The results of this study suggest that benthic foraminiferal assemblages can be used for quantitative reconstruction of surface ocean productivity and bottom water oxygen concentrations if suitable surface sediment calibration data sets are developed and appropriate means of detecting no-analog samples are found.
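A minimal sketch of the calibration idea, substituting plain multivariate least squares for the study's MANOVA-based regression: regress the two environmental variables on a synthetic faunal percentage matrix and report an r^2 per dimension. Note that with 46 taxa and 70 samples the fit is close to saturated, which is one reason independent validation and no-analog screening matter:

```python
import numpy as np

# Synthetic stand-ins: 70 samples x 46 taxon percentages, as in the design.
rng = np.random.default_rng(5)
fauna = rng.dirichlet(np.ones(46), size=70) * 100
true_w = rng.normal(size=(46, 2))
env = fauna @ true_w + rng.normal(0, 5, size=(70, 2))  # productivity, oxygen

X = np.column_stack([np.ones(70), fauna])              # add an intercept
coef, *_ = np.linalg.lstsq(X, env, rcond=None)         # OLS, not MANOVA
pred = X @ coef
r2 = 1 - ((env - pred) ** 2).sum(0) / ((env - env.mean(0)) ** 2).sum(0)
print(f"r^2 productivity = {r2[0]:.2f}, r^2 oxygen = {r2[1]:.2f}")
# With 47 predictors for 70 samples, in-sample r^2 is optimistic by
# construction; cross-validation is needed before any paleo application.
```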

Relevance: 60.00%

Abstract:

An expanded Cariaco Basin 14C chronology is tied to 230Th-dated Hulu Cave speleothem records in order to provide detailed marine-based 14C calibration for the past 50,000 years. The revised, high-resolution Cariaco 14C calibration record agrees well with data from 230Th-dated fossil corals back to 33 ka, with continued agreement, despite increased scatter, back to 50 ka, suggesting that the record provides accurate calibration back to the limits of radiocarbon dating. The calibration data document highly elevated Δ14C during the Glacial period. Carbon cycle box model simulations show that the majority of the observed Δ14C change can be explained by increased 14C production. However, from 45 to 15 ka, Δ14C remains anomalously high, indicating that the distribution of radiocarbon between surface and deep ocean reservoirs was different from today's. Additional observations of the magnitude, spatial extent and timing of deep ocean Δ14C shifts are critical for a complete understanding of the observed Glacial Δ14C variability.
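The Δ14C values in such records are derived from paired radiocarbon and calendar ages; a minimal sketch of the standard conversion, using the Libby mean life (8033 yr) for the conventional 14C age and the true 14C mean life (8267 yr) for decay, with illustrative ages rather than Cariaco data:

```python
import math

def delta14c(c14_age_yr, cal_age_yr):
    """Initial Delta14C (per mil) of a sample from its conventional 14C age
    and an independent calendar age (e.g. varve or 230Th chronology)."""
    fm = math.exp(-c14_age_yr / 8033.0)           # measured fraction modern
    return (fm * math.exp(cal_age_yr / 8267.0) - 1.0) * 1000.0

# e.g. a layer dated to 30,000 cal yr BP with a 14C age of 26,500 yr
print(f"{delta14c(26_500, 30_000):.0f} per mil")  # ~391, an elevated value
```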