924 results for error model


Relevance:

30.00%

Publisher:

Abstract:

Hydrographers have traditionally referred to the nearshore area as the "white ribbon" area due to the challenges associated with the collection of elevation data in this highly dynamic transitional zone between terrestrial and marine environments. Accordingly, available information in this zone is typically characterised by a range of datasets from disparate sources. In this paper we propose a framework to 'fill' the white ribbon area of a coral reef system by integrating multiple elevation and bathymetric datasets, acquired by a suite of remote-sensing technologies, into a seamless digital elevation model (DEM). A range of datasets is integrated, including field-collected GPS elevation points, terrestrial and bathymetric LiDAR, single- and multibeam bathymetry, nautical chart depths and empirically derived bathymetry estimates from optical remote-sensing imagery. The proposed framework ranks data reliability internally, thereby avoiding the requirement to quantify absolute error, and results in a high-resolution, seamless product. Nested within this approach is an effective spatially explicit technique for improving the accuracy of bathymetry estimates derived empirically from optical satellite imagery by modelling the spatial structure of the residuals. The approach was applied to data collected on and around Lizard Island in northern Australia. Collectively, the framework holds promise for filling the white ribbon zone in coastal areas characterised by similar data availability. The seamless DEM is referenced to the MGA Zone 55 - GDA 1994 horizontal coordinate system and the mean sea level (MSL) vertical datum, and has a spatial resolution of 20 m.
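
One simple way such a residual correction can work is sketched below. This is my own illustration, not the authors' method: a Stumpf-style log-ratio regression predicts depth from two optical bands, and an inverse-distance interpolator (standing in for whatever spatial model of the residuals the paper uses) redistributes the calibration residuals back onto the grid. All data and names are synthetic placeholders.

```python
import numpy as np

def fit_log_ratio(b1, b2, depth):
    """Fit depth ~ m1 * ln(1000*b1)/ln(1000*b2) + m0 (Stumpf-style ratio model)."""
    ratio = np.log(1000 * b1) / np.log(1000 * b2)
    A = np.column_stack([ratio, np.ones_like(ratio)])
    (m1, m0), *_ = np.linalg.lstsq(A, depth, rcond=None)
    return m1, m0

def idw(xy_known, values, xy_query, power=2.0):
    """Inverse-distance weighting: a simple stand-in for kriging the residuals."""
    d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=2)
    w = 1.0 / np.maximum(d, 1e-6) ** power
    return (w * values).sum(axis=1) / w.sum(axis=1)

# Synthetic calibration soundings and band reflectances sampled at them.
rng = np.random.default_rng(0)
xy_cal = rng.uniform(0, 1000, (50, 2))
b1, b2 = rng.uniform(0.02, 0.2, (2, 50))
z_cal = rng.uniform(1, 20, 50)

# Fit the empirical model, then model the spatial structure of its residuals.
m1, m0 = fit_log_ratio(b1, b2, z_cal)
z_hat = m1 * np.log(1000 * b1) / np.log(1000 * b2) + m0
residuals = z_cal - z_hat

# Correct ratio-model depths at new grid cells with the interpolated residuals.
xy_grid = rng.uniform(0, 1000, (5, 2))
z_grid_raw = rng.uniform(1, 20, 5)
z_grid_corrected = z_grid_raw + idw(xy_cal, residuals, xy_grid)
```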

Relevance:

30.00%

Publisher:

Abstract:

Orbital tuning of benthic δ18O is a common approach for assigning ages to ocean sediment records. Similar environmental forcing of the northern South China Sea and the southeast Asian cave regions allows transfer of the speleothem δ18O radiometric chronology to the planktonic and benthic δ18O records from Ocean Drilling Program Site 1146, yielding a new chronology with 41 radiometrically calibrated datums spanning the past 350 kyr. This approach also provides an independent assessment of the accuracy of the orbitally tuned benthic δ18O chronology for the last 350 kyr. The largest differences relative to the latest chronology occur in marine isotope stages (MIS) 5.4, 5.5, 6, 7, and 9.3. Prominent suborbital-scale structure believed to be global in nature is identified within MIS 5.4 and MIS 7.2. On the basis of the radiometrically calibrated chronology, the time constant of the ice sheet is found to be 5.4 kyr at the precession band (light δ18O lags precession minima by 55.4°) and 10.4 kyr at the obliquity band (light δ18O lags obliquity maxima by 57.4°). These values are significantly shorter than both the single 17 kyr time constant originally estimated by Imbrie et al. (1984), based primarily on the timing of terminations I and II, and the 15 kyr time constant used by Lisiecki and Raymo (2005, doi:10.1029/2004PA001071).
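
For readers unfamiliar with the conversion between phase lag and time constant: under a single-exponential (Imbrie and Imbrie-type) ice-volume response, a lag of φ at forcing period P implies τ = P·tan(φ)/(2π). A quick back-of-envelope check (my arithmetic, assuming nominal ~23 kyr precession and 41 kyr obliquity periods) roughly reproduces the quoted values:

```python
import math

def time_constant(phase_deg, period_kyr):
    """tau = P * tan(phi) / (2*pi) for a single-exponential linear response."""
    return period_kyr * math.tan(math.radians(phase_deg)) / (2 * math.pi)

print(time_constant(55.4, 23.0))   # ~5.3 kyr, close to the quoted 5.4 kyr
print(time_constant(57.4, 41.0))   # ~10.2 kyr, close to the quoted 10.4 kyr
```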

Relevance:

30.00%

Publisher:

Abstract:

Terrigenous sediment supply, marine transport, and depositional processes along tectonically active margins are key to decoding turbidite successions as potential archives of climatic and seismic forcings. Sequence stratigraphic models predict coarse-grained sediment delivery to deep-marine sites mainly during sea-level fall and lowstand. Marine siliciclastic deposition during transgressions and highstands has been attributed to sustained connectivity between terrigenous sources and marine sinks facilitated by narrow shelves. To decipher the controls on Holocene highstand turbidite deposition, we analyzed 12 sediment cores from spatially discrete, coeval turbidite systems along the Chile margin (29° - 40°S) with changing climatic and geomorphic characteristics but uniform changes in sea level. Sediment cores from intraslope basins in north-central Chile (29° - 33°S) offshore a narrow to absent shelf record a shut-off of turbidite deposition during the Holocene due to postglacial aridification. In contrast, core sites in south-central Chile (36° - 40°S) offshore a wide shelf record frequent turbidite deposition during highstand conditions. Two core sites are linked to the Biobío river-canyon system and receive sediment directly from the river mouth. However, intraslope basins are not connected via canyons to fluvial systems but yield even higher turbidite frequencies. High sediment supply combined with a wide shelf and an undercurrent moving sediment toward the shelf edge appear to control Holocene turbidite sedimentation and distribution. Shelf undercurrents may play an important role in lateral sediment transport and supply to the deep sea and need to be accounted for in sediment-mass balances.

Relevance:

30.00%

Publisher:

Abstract:

Structure from Motion (SfM) is a new form of photogrammetry that automates the rendering of georeferenced 3D models of objects from digital photographs and independently surveyed Ground Control Points (GCPs). This project seeks to quantify the error found in Digital Elevation Models (DEMs) produced using SfM. I modeled a rockslide at the Cadman Quarry (Monroe, Washington) because the surface is vegetation-free, which is ideal for SfM and Terrestrial LiDAR Scanner (TLS) surveys. By using SfM, TLS, and GPS positioning at the same time, I attempted to find the deviation of the SfM model from the TLS model and GPS points. Using the deviation, I found the Root-Mean-Square Error (RMSE) between the SfM DEM and GPS positions. The RMSE of the SfM model compared to surveyed GPS points is 17 cm. I propagated the uncertainty of the GPS points with the RMSE of the SfM model to find the uncertainty of the SfM model relative to the NAD 1984 datum. The uncertainty of the SfM model compared to the NAD 1984 datum is 27 cm. This study did not produce a model from the TLS with sufficient resolution on horizontal surfaces to compare to surveyed GPS points.
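
The propagation described is a root-sum-of-squares combination of independent errors. A minimal sketch (names are mine; the GPS uncertainty is back-solved from the quoted numbers rather than taken from the thesis):

```python
import numpy as np

def rmse(predicted, observed):
    """Root-mean-square error between DEM elevations and check points."""
    return float(np.sqrt(np.mean((predicted - observed) ** 2)))

rmse_sfm = 0.17   # m, quoted RMSE of the SfM DEM against GPS check points

# Combine independent uncertainties in quadrature to reference the model
# to the datum: sigma_total = sqrt(rmse^2 + sigma_gps^2).
# Back-solving sqrt(0.27^2 - 0.17^2) suggests sigma_gps ~= 0.21 m (inferred).
sigma_gps = 0.21  # m, hypothetical value consistent with the quoted result
sigma_total = np.hypot(rmse_sfm, sigma_gps)
print(round(sigma_total, 2))  # ~0.27 m, the quoted datum-referenced uncertainty
```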

Relevance:

30.00%

Publisher:

Abstract:

The use of presence/absence data in wildlife management and biological surveys is widespread. There is a growing interest in quantifying the sources of error associated with these data. Using simulated data, we show that false-negative errors (failure to record a species when in fact it is present) can have a significant impact on statistical estimation of habitat models. We then introduce an extension of logistic modeling, the zero-inflated binomial (ZIB) model, that permits estimation of the rate of false-negative errors and correction of estimates of the probability of occurrence for false-negative errors by using repeated visits to the same site. Our simulations show that even relatively low rates of false negatives bias statistical estimates of habitat effects. With three repeated visits the method eliminates the bias, but estimates are relatively imprecise. Six repeated visits improve the precision of estimates to levels comparable to those achieved with conventional statistics in the absence of false-negative errors. In general, when error rates are less than or equal to 50%, greater efficiency is gained by adding more sites, whereas when error rates are >50% it is better to increase the number of repeated visits. We highlight the flexibility of the method with three case studies, clearly demonstrating the effect of false-negative errors for a range of commonly used survey methods.
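
A minimal sketch of the core idea, under the simplest assumptions (constant occupancy ψ and per-visit detection probability p, K visits per site; all names are mine, not the authors'): the site-level likelihood mixes a "present but missed" binomial term with a structural zero.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import binom

def zib_negloglik(params, detections, n_visits):
    """Zero-inflated binomial: site occupied w.p. psi, detected w.p. p per visit."""
    psi, p = 1 / (1 + np.exp(-params))  # logit parameterization keeps (0, 1)
    lik_occupied = binom.pmf(detections, n_visits, p)
    lik = psi * lik_occupied + (1 - psi) * (detections == 0)
    return -np.sum(np.log(lik))

# Simulated survey: 200 sites, psi = 0.6, p = 0.3, K = 3 repeated visits.
rng = np.random.default_rng(1)
occupied = rng.random(200) < 0.6
y = rng.binomial(3, 0.3, 200) * occupied  # unoccupied sites yield zeros

fit = minimize(zib_negloglik, x0=np.zeros(2), args=(y, 3))
psi_hat, p_hat = 1 / (1 + np.exp(-fit.x))
print(psi_hat, p_hat)  # should land near 0.6 and 0.3, with modest precision
```

Consistent with the abstract, estimates from only K = 3 visits are noticeably imprecise; raising K tightens them.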

Relevance:

30.00%

Publisher:

Abstract:

Background: Reliability or validity studies are important for the evaluation of measurement error in dietary assessment methods. An approach to validation known as the method of triads uses triangulation techniques to calculate the validity coefficient of a food-frequency questionnaire (FFQ). Objective: To assess the validity of FFQ estimates of carotenoid and vitamin E intake against serum biomarker measurements and weighed food records (WFRs) by applying the method of triads. Design: The study population was a sub-sample of adult participants in a randomised controlled trial of beta-carotene and sunscreen in the prevention of skin cancer. Dietary intake was assessed by a self-administered FFQ and a WFR. Non-fasting blood samples were collected and plasma analysed for five carotenoids (alpha-carotene, beta-carotene, beta-cryptoxanthin, lutein, lycopene) and vitamin E. Correlation coefficients were calculated between each pair of dietary methods, and the validity coefficient was calculated using the method of triads. The 95% confidence intervals for the validity coefficients were estimated using bootstrap sampling. Results: The validity coefficients of the FFQ were highest for alpha-carotene (0.85) and lycopene (0.62), followed by beta-carotene (0.55) and total carotenoids (0.55), while the lowest validity coefficient was for lutein (0.19). The method of triads could not be used for beta-cryptoxanthin and vitamin E, as one of the three underlying correlations was negative. Conclusions: Results were similar to other studies of validity using biomarkers and the method of triads. For many dietary factors, the upper limit of the validity coefficients was less than 0.5, and therefore only strong relationships between dietary exposure and disease will be detected.
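
For reference, the method of triads derives each instrument's validity coefficient from the three pairwise correlations; with F = FFQ, W = WFR and B = biomarker, VC_F = sqrt(r_FW · r_FB / r_WB). A small sketch with hypothetical variable names, including the bootstrap CI the authors describe:

```python
import numpy as np

def validity_coefficient(ffq, wfr, biomarker):
    """Method of triads: VC_FFQ = sqrt(r_FW * r_FB / r_WB)."""
    r_fw = np.corrcoef(ffq, wfr)[0, 1]
    r_fb = np.corrcoef(ffq, biomarker)[0, 1]
    r_wb = np.corrcoef(wfr, biomarker)[0, 1]
    arg = r_fw * r_fb / r_wb
    # Undefined when one correlation is negative (the beta-cryptoxanthin
    # and vitamin E situation in the abstract); commonly truncated at 1.
    return min(np.sqrt(arg), 1.0) if arg > 0 else np.nan

def bootstrap_ci(ffq, wfr, biomarker, n_boot=2000, seed=0):
    """Percentile 95% CI from resampling subjects with replacement."""
    rng = np.random.default_rng(seed)
    n = len(ffq)
    vcs = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)
        vc = validity_coefficient(ffq[idx], wfr[idx], biomarker[idx])
        if not np.isnan(vc):
            vcs.append(vc)
    return np.percentile(vcs, [2.5, 97.5])
```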

Relevance:

30.00%

Publisher:

Abstract:

A framework for developing marketing category management decision support systems (DSS) based upon the Bayesian Vector Autoregressive (BVAR) model is extended. Since the BVAR model is vulnerable to permanent and temporary shifts in purchasing patterns over time, a form that can correct for the shifts and still provide the other advantages of the BVAR is a Bayesian Vector Error-Correction Model (BVECM). We present the mechanics of extending the DSS to move from a BVAR model to the BVECM model for the category management problem. Several additional iterative steps are required in the DSS to allow the decision maker to arrive at the best forecast possible. The revised marketing DSS framework and model fitting procedures are described. Validation is conducted on a sample problem.
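
The abstract concerns the Bayesian variant; as a simpler, non-Bayesian illustration of the error-correction form itself, statsmodels provides a classical VECM (the data array, lag order and deterministic term below are placeholders):

```python
import numpy as np
from statsmodels.tsa.vector_ar.vecm import VECM, select_coint_rank

# Placeholder: weekly sales histories for three products in one category.
data = np.cumsum(np.random.default_rng(2).normal(size=(200, 3)), axis=0)

# Choose the cointegration rank by the Johansen trace test, then fit the
# error-correction model and forecast the category.
rank = select_coint_rank(data, det_order=0, k_ar_diff=1).rank
res = VECM(data, k_ar_diff=1, coint_rank=max(rank, 1), deterministic="co").fit()
forecast = res.predict(steps=8)  # 8-step-ahead forecasts, shape (8, 3)
```

The Bayesian priors and the iterative decision-support steps the paper describes would sit on top of this basic estimation loop.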

Relevance:

30.00%

Publisher:

Abstract:

This letter presents an analytical model for evaluating the Bit Error Rate (BER) of a Direct Sequence Code Division Multiple Access (DS-CDMA) system, with M-ary orthogonal modulation and noncoherent detection, employing an array antenna operating in a Nakagami fading environment. An expression of the Signal to Interference plus Noise Ratio (SINR) at the output of the receiver is derived, which allows the BER to be evaluated using a closed form expression. The analytical model is validated by comparing the obtained results with simulation results.
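
The paper's closed form covers M-ary orthogonal modulation with array reception in Nakagami fading. As a much-reduced sanity-check of the same simulate-versus-closed-form workflow, the sketch below (my own, not the paper's model) uses binary noncoherent orthogonal signalling in Rayleigh fading (Nakagami m = 1, single antenna), where BER = 1/(2 + γ̄) is a standard result:

```python
import numpy as np

def ber_sim(snr_db, n_bits=200_000, seed=3):
    """Monte Carlo BER of noncoherent binary orthogonal FSK in Rayleigh fading."""
    rng = np.random.default_rng(seed)
    snr = 10 ** (snr_db / 10)
    h = (rng.normal(size=n_bits) + 1j * rng.normal(size=n_bits)) / np.sqrt(2)
    n0 = (rng.normal(size=n_bits) + 1j * rng.normal(size=n_bits)) / np.sqrt(2)
    n1 = (rng.normal(size=n_bits) + 1j * rng.normal(size=n_bits)) / np.sqrt(2)
    # Tone 0 sent: branch 0 carries signal plus noise, branch 1 noise only.
    r0 = np.abs(h * np.sqrt(snr) + n0)
    r1 = np.abs(n1)
    return np.mean(r1 > r0)  # envelope detector picks the larger branch

for snr_db in (0, 5, 10):
    gamma = 10 ** (snr_db / 10)
    print(snr_db, ber_sim(snr_db), 1 / (2 + gamma))  # simulation vs closed form
```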

Relevance:

30.00%

Publisher:

Abstract:

Hysteresis models that eliminate the artificial pumping errors associated with the Kool-Parker (KP) soil moisture hysteresis model, such as the Parker-Lenhard (PL) method, can be computationally demanding in unsaturated transport models since they need to retain the wetting-drying history of the system. The pumping errors in these models need to be eliminated for correct simulation of cyclical systems (e.g. transport above a tidally forced watertable, infiltration and redistribution under periodic irrigation) if the soils exhibit significant hysteresis. A modification is made here to the PL method that allows it to be more readily applied to numerical models by eliminating the need to store a large number of soil moisture reversal points. The modified-PL method largely eliminates any artificial pumping error and so essentially retains the accuracy of the original PL approach. The modified-PL method is implemented in HYDRUS-1D (version 2.0), which is then used to simulate cyclic capillary fringe dynamics to show the influence of removing artificial pumping errors and to demonstrate the ease of implementation. Artificial pumping errors are shown to be significant for the soils and system characteristics used here in numerical experiments of transport above a fluctuating watertable.
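
The history the PL method must retain is essentially a stack of wetting-drying reversal points. Below is a schematic sketch of that bookkeeping (deliberately simplified and mine, not the authors' modified-PL algorithm or HYDRUS-1D code); real KP/PL implementations attach scaled scanning curves to these points, whereas here only the memory discipline is shown:

```python
class ReversalMemory:
    """Schematic return-point memory for a hysteretic theta(h) relation."""

    def __init__(self):
        self.stack = []        # alternating (h, theta) reversal points
        self.direction = None  # "wetting" (h increasing) or "drying"

    def update(self, h_prev, theta_prev, h_new):
        new_dir = "wetting" if h_new > h_prev else "drying"
        if self.direction is not None and new_dir != self.direction:
            self.stack.append((h_prev, theta_prev))  # record the reversal
        self.direction = new_dir
        # "Wiping out": once the trajectory passes the reversal that opened
        # the current sub-loop, that loop closes and both of its endpoints
        # can be discarded. Under cyclic forcing (tides, irrigation) this
        # stack would otherwise grow without bound, which is the storage
        # cost the modified-PL method is designed to avoid.
        while len(self.stack) >= 2 and (
            (new_dir == "wetting" and h_new >= self.stack[-2][0])
            or (new_dir == "drying" and h_new <= self.stack[-2][0])
        ):
            self.stack.pop()
            self.stack.pop()
```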

Relevance:

30.00%

Publisher:

Abstract:

This paper is an expanded and more detailed version of the work [1] in which the Operator Quantum Error Correction formalism was introduced. This is a new scheme for the error correction of quantum operations that incorporates the known techniques - i.e. the standard error correction model, the method of decoherence-free subspaces, and the noiseless subsystem method - as special cases, and relies on a generalized mathematical framework for noiseless subsystems that applies to arbitrary quantum operations. We also discuss a number of examples and introduce the notion of unitarily noiseless subsystems.
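
As a concrete instance of the "standard error correction model" special case, here is a small numpy sketch of the three-qubit bit-flip code (my own illustration, unrelated to the paper's general formalism): syndrome values of Z1Z2 and Z2Z3 locate a single X error without disturbing the encoded amplitudes.

```python
import numpy as np

def apply_x(state, qubit):
    """Apply a bit flip X to `qubit` (0 = most significant) of a 3-qubit state."""
    return state[np.arange(8) ^ (1 << (2 - qubit))]

def zz_parity(state, q1, q2):
    """Deterministic <Z_q1 Z_q2> syndrome value for code/error-space states."""
    bits = ((np.arange(8) >> (2 - q1)) ^ (np.arange(8) >> (2 - q2))) & 1
    return int(round(np.sum(np.abs(state) ** 2 * (-1.0) ** bits)))

alpha, beta = 0.6, 0.8
encoded = np.zeros(8, dtype=complex)
encoded[0b000], encoded[0b111] = alpha, beta      # alpha|000> + beta|111>

corrupted = apply_x(encoded, 1)                   # a single bit-flip error

syndrome = (zz_parity(corrupted, 0, 1), zz_parity(corrupted, 1, 2))
locate = {(1, 1): None, (-1, 1): 0, (-1, -1): 1, (1, -1): 2}
qubit = locate[syndrome]
recovered = apply_x(corrupted, qubit) if qubit is not None else corrupted
assert np.allclose(recovered, encoded)            # amplitudes restored intact
```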

Relevance:

30.00%

Publisher:

Abstract:

We analyse Gallager codes by employing a simple mean-field approximation that distorts the model geometry and preserves important interactions between sites. The method naturally recovers the probability propagation decoding algorithm as a minimization of a proper free energy. We find a thermodynamical phase transition that coincides with information-theoretic upper bounds and explain the practical code performance in terms of the free-energy landscape.
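
The probability propagation (belief propagation) decoder the analysis recovers can be sketched compactly. Below is a generic LLR-domain sum-product decoder for a toy parity-check matrix, my own minimal version rather than anything tuned to the regular Gallager ensembles studied in the paper, exercised on the all-zero codeword over a BPSK/AWGN channel:

```python
import numpy as np

def bp_decode(H, llr_ch, iters=50):
    """LLR-domain sum-product (probability propagation) decoding."""
    mask = H.astype(bool)
    msg_vc = np.where(mask, llr_ch, 0.0)          # variable -> check messages
    hard = (llr_ch < 0).astype(int)
    for _ in range(iters):
        # Check-node update: extrinsic tanh-product over the other neighbours.
        t = np.where(mask, np.tanh(np.clip(msg_vc, -20, 20) / 2), 1.0)
        prod = np.prod(t, axis=1, keepdims=True)
        extr = np.clip(prod / np.where(t == 0, 1e-12, t), -0.999999, 0.999999)
        msg_cv = np.where(mask, 2 * np.arctanh(extr), 0.0)
        # Variable-node update: channel LLR plus the other checks' messages.
        total = llr_ch + msg_cv.sum(axis=0)
        msg_vc = np.where(mask, total - msg_cv, 0.0)
        hard = (total < 0).astype(int)
        if not ((H @ hard) % 2).any():            # all parity checks satisfied
            break
    return hard

# Toy (7,4) Hamming parity-check matrix; all-zero codeword, BPSK over AWGN.
H = np.array([[1, 1, 1, 0, 1, 0, 0],
              [1, 1, 0, 1, 0, 1, 0],
              [1, 0, 1, 1, 0, 0, 1]])
rng = np.random.default_rng(4)
sigma = 0.8
y = 1.0 + sigma * rng.normal(size=7)              # +1 symbols encode bit 0
llr = 2 * y / sigma**2
print(bp_decode(H, llr))                          # expect the all-zero word
```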

Relevance:

30.00%

Publisher:

Abstract:

Background: The optimisation and scale-up of process conditions leading to high yields of recombinant proteins is an enduring bottleneck in the post-genomic sciences. Typical experiments rely on varying selected parameters through repeated rounds of trial-and-error optimisation. To rationalise this, several groups have recently adopted the 'design of experiments' (DoE) approach frequently used in industry. Studies have focused on parameters such as medium composition, nutrient feed rates and induction of expression in shake flasks or bioreactors, as well as oxygen transfer rates in micro-well plates. In this study we wanted to generate a predictive model that described small-scale screens and to test its scalability to bioreactors. Results: Here we demonstrate how the use of a DoE approach in a multi-well mini-bioreactor permitted the rapid establishment of high yielding production phase conditions that could be transferred to a 7 L bioreactor. Using green fluorescent protein secreted from Pichia pastoris, we derived a predictive model of protein yield as a function of the three most commonly-varied process parameters: temperature, pH and the percentage of dissolved oxygen in the culture medium. Importantly, when yield was normalised to culture volume and density, the model was scalable from mL to L working volumes. By increasing pre-induction biomass accumulation, model-predicted yields were further improved. Yield improvement was most significant, however, on varying the fed-batch induction regime to minimise methanol accumulation so that the productivity of the culture increased throughout the whole induction period. These findings suggest the importance of matching the rate of protein production with the host metabolism. Conclusion: We demonstrate how a rational, stepwise approach to recombinant protein production screens can reduce process development time.
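
The predictive model described is a response surface over three factors. A generic sketch of that kind of fit (synthetic data and coefficients, purely illustrative; not the authors' design or model) using an ordinary least-squares quadratic in coded temperature, pH and dissolved-oxygen levels:

```python
import numpy as np
from itertools import product

# Full-factorial screen over coded factor levels (-1, 0, +1) for
# temperature, pH and %DO, as a simple DoE-style design matrix.
levels = np.array(list(product([-1, 0, 1], repeat=3)), dtype=float)

def quadratic_features(X):
    """[1, x1, x2, x3, x1^2, x2^2, x3^2, x1x2, x1x3, x2x3] per run."""
    x1, x2, x3 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1**2, x2**2, x3**2, x1*x2, x1*x3, x2*x3])

def true_yield(X):
    """Synthetic yield surface with an optimum away from the centre point."""
    return 10 - 2*(X[:, 0] - 0.3)**2 - 1.5*(X[:, 1] + 0.2)**2 - X[:, 2]**2

rng = np.random.default_rng(5)
y = true_yield(levels) + rng.normal(0, 0.1, len(levels))

# Fit the second-order response surface by least squares.
coef, *_ = np.linalg.lstsq(quadratic_features(levels), y, rcond=None)

# Locate the model-predicted optimum on a fine grid of coded levels.
grid = np.array(list(product(np.linspace(-1, 1, 21), repeat=3)))
best = grid[np.argmax(quadratic_features(grid) @ coef)]
print(best)  # should sit near the true optimum (0.3, -0.2, 0.0)
```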

Relevance:

30.00%

Publisher:

Abstract:

This study examines the forecasting accuracy of alternative vector autoregressive models, each in a seven-variable system comprising, in turn, daily, weekly and monthly foreign exchange (FX) spot rates. The vector autoregressions (VARs) are in non-stationary, stationary and error-correction forms and are estimated using OLS. The imposition of Bayesian priors in the OLS estimations also allowed us to obtain another set of results. We find some tendency for the Bayesian estimation method to generate superior forecast measures relative to the OLS method. This result holds whether or not the data sets contain outliers. Also, the best forecasts under the non-stationary specification outperformed those of the stationary and error-correction specifications, particularly at long forecast horizons, while the best forecasts under the stationary and error-correction specifications are generally similar. The findings for the OLS forecasts are consistent with recent simulation results. The predictive ability of the VARs is very weak.
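
For context, the non-Bayesian leg of such a comparison is straightforward with statsmodels (the FX data below is a synthetic placeholder; the Bayesian priors and error-correction variants would need extra machinery):

```python
import numpy as np
from statsmodels.tsa.api import VAR

# Placeholder for a 7-variable system of (log) FX spot rates.
rng = np.random.default_rng(6)
rates = np.cumsum(rng.normal(scale=0.01, size=(500, 7)), axis=0)

# Stationary specification: fit a VAR(2) on first differences.
diffs = np.diff(rates, axis=0)
res = VAR(diffs).fit(maxlags=2)                  # or select lags via ic="aic"
fc = res.forecast(diffs[-res.k_ar:], steps=12)   # 12-step-ahead forecasts
print(fc.shape)                                  # (12, 7)
```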