811 results for Algorithm Calibration
Abstract:
We present a simple randomized procedure for the prediction of a binary sequence. The algorithm uses ideas from recent developments in the theory of prediction of individual sequences. We show that if the sequence is a realization of a stationary and ergodic random process, then the average number of mistakes converges, almost surely, to that of the optimum, given by the Bayes predictor.
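The flavor of such a randomized predictor can be illustrated with a minimal sketch: two constant "experts" (always predict 0, always predict 1) are weighted exponentially by their past mistakes, and each prediction is drawn at random from the normalized weights. This is a generic exponential-weights illustration, not the paper's exact procedure; the function name and parameters are assumptions.

```python
import math
import random

def randomized_predict(sequence, eta=0.5, seed=0):
    """Randomized binary prediction with two constant experts
    (always-0 and always-1), exponentially penalized for past
    mistakes; each guess is sampled from the normalized weights."""
    rng = random.Random(seed)
    w = [1.0, 1.0]                 # weights of expert 0 and expert 1
    mistakes = 0
    for bit in sequence:
        p1 = w[1] / (w[0] + w[1])  # probability of predicting 1
        guess = 1 if rng.random() < p1 else 0
        if guess != bit:
            mistakes += 1
        w[1 - bit] *= math.exp(-eta)  # penalize the expert that was wrong
    return mistakes
```

On a constant sequence the weight of the wrong expert decays geometrically, so the mistake rate vanishes, mirroring the convergence-to-optimum behavior the abstract describes.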
Abstract:
We present a new integrated assessment model for the north-central stock of the Peruvian anchoveta that makes it possible to reconstruct and track the length structure of the stock from an age-based model. The model was calibrated using acoustic biomass estimates and size structures from scientific surveys and from fishery landings. The calibration used an evolutionary algorithm with a different fitness function for each calibrated variable (biomass and catch). We present the monthly estimates of total biomass, spawning biomass, recruitment, and fishing mortality obtained by the integrated assessment model for the period 1964-2008. Three qualitatively distinct periods were found in the anchoveta dynamics, between 1961-1971, 1971-1991, and 1991 to the present, distinguished both by mean annual biomass and by observed recruitment levels.
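The calibration step can be sketched with a toy (1+λ) evolutionary search: mutate the current best parameter vector with Gaussian noise and keep any candidate with better fitness. This is a generic illustration under assumed names and a single scalar fitness, not the paper's per-variable fitness scheme for biomass and catch.

```python
import random

def evolve_parameters(fitness, init, generations=200, pop=20, sigma=0.1, seed=0):
    """Toy (1+lambda) evolutionary search: Gaussian mutation of the
    best parameter vector, keeping the fittest candidate (lower is better)."""
    rng = random.Random(seed)
    best = list(init)
    best_fit = fitness(best)
    for _ in range(generations):
        for _ in range(pop):
            cand = [x + rng.gauss(0, sigma) for x in best]
            f = fitness(cand)
            if f < best_fit:
                best, best_fit = cand, f
    return best, best_fit

# usage: calibrate one hypothetical parameter against a fake observation
best, err = evolve_parameters(lambda p: (p[0] - 2.5) ** 2, [0.0])
```

In the actual model each calibrated variable would contribute its own fitness term (e.g. misfit to acoustic biomass and to catches) rather than a single squared error.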
Abstract:
This paper compares two well-known scan matching algorithms: MbICP and pIC. As a result of the study, we propose MSISpIC, a probabilistic scan matching algorithm for the localization of an Autonomous Underwater Vehicle (AUV). The technique uses range scans gathered with a Mechanical Scanning Imaging Sonar (MSIS) and the robot displacement estimated through dead reckoning with the help of a Doppler Velocity Log (DVL) and a Motion Reference Unit (MRU). The proposed method is an extension of the pIC algorithm. Its major contributions are: 1) using an EKF to estimate the local path traveled by the robot while grabbing the scan, as well as its uncertainty, and 2) a method to group all the data grabbed along the path described by the robot into a single scan with a convenient uncertainty model. The algorithm has been tested on an AUV guided along a 600 m path within a marina environment, with satisfactory results.
Abstract:
Calibration of TDR measurements for moisture measurement in cultivated peat soils
Abstract:
Achieving high quality in final products in the pharmaceutical industry is a challenge that requires the control and supervision of all manufacturing steps. This requirement has created the need to develop fast and accurate analytical methods. Near-infrared (NIR) spectroscopy combined with chemometrics fulfills this growing demand: its speed in providing relevant information and its versatility across different types of samples make the combined techniques among the most appropriate. This study focuses on the development of a calibration model able to determine amounts of API in industrial granulates using NIR, chemometrics, and process spectra methodology.
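The core of such a chemometric calibration is a multivariate regression from spectra to analyte content. As a minimal sketch, here is an ordinary-least-squares stand-in for the PLS-type models typical in NIR chemometrics; the function names and data layout (rows = samples, columns = wavelengths) are assumptions.

```python
import numpy as np

def fit_calibration(spectra, api_content):
    """Least-squares calibration relating NIR spectra to API content.
    A stand-in for PLS regression, shown with ordinary least squares."""
    X = np.column_stack([np.ones(len(spectra)), spectra])  # add intercept
    coeffs, *_ = np.linalg.lstsq(X, api_content, rcond=None)
    return coeffs

def predict_api(coeffs, spectra):
    """Apply a fitted calibration to new spectra."""
    X = np.column_stack([np.ones(len(spectra)), spectra])
    return X @ coeffs
```

In practice PLS (or PCR) would be preferred over plain least squares because NIR spectra have many collinear wavelengths relative to the number of calibration samples.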
Abstract:
Nominal unification is an extension of first-order unification where terms can contain binders and unification is performed modulo α-equivalence. Here we prove that the existence of nominal unifiers can be decided in quadratic time. First, we linearly reduce nominal unification problems to a sequence of freshness constraints and equalities between atoms, modulo a permutation, using ideas of Paterson and Wegman for first-order unification. Second, we prove that solvability of these reduced problems can be checked in quadratic time. Finally, we point out how, using ideas of Brown and Tarjan for unbalanced merging, these reduced problems could be solved more efficiently.
Abstract:
Summary Background: We previously derived a clinical prognostic algorithm to identify patients with pulmonary embolism (PE) who are at low risk of short-term mortality and who could be safely discharged early or treated entirely in an outpatient setting. Objectives: To externally validate the clinical prognostic algorithm in an independent patient sample. Methods: We validated the algorithm in 983 consecutive patients prospectively diagnosed with PE at an emergency department of a university hospital. Patients with none of the algorithm's 10 prognostic variables (age ≥ 70 years, cancer, heart failure, chronic lung disease, chronic renal disease, cerebrovascular disease, pulse ≥ 110/min, systolic blood pressure < 100 mm Hg, oxygen saturation < 90%, and altered mental status) at baseline were defined as low-risk. We compared 30-day overall mortality among low-risk patients based on the algorithm between the validation and the original derivation sample. We also assessed the rate of PE-related and bleeding-related mortality among low-risk patients. Results: Overall, the algorithm classified 16.3% of patients with PE as low-risk. Mortality at 30 days was 1.9% among low-risk patients and did not differ between the validation and the original derivation sample. Among low-risk patients, only 0.6% died from definite or possible PE, and 0% died from bleeding. Conclusions: This study validates an easy-to-use clinical prognostic algorithm for PE that accurately identifies patients with PE who are at low risk of short-term mortality. Low-risk patients based on our algorithm are potential candidates for less costly outpatient treatment.
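Because the rule is simply "low-risk if none of the 10 prognostic variables is present," it translates directly into code. The dictionary keys below are illustrative names chosen for this sketch, not identifiers from the study.

```python
def is_low_risk(patient):
    """Classify a PE patient as low-risk when none of the 10
    prognostic variables from the algorithm is present."""
    return not any([
        patient["age"] >= 70,
        patient["cancer"],
        patient["heart_failure"],
        patient["chronic_lung_disease"],
        patient["chronic_renal_disease"],
        patient["cerebrovascular_disease"],
        patient["pulse"] >= 110,
        patient["systolic_bp"] < 100,
        patient["oxygen_saturation"] < 90,
        patient["altered_mental_status"],
    ])
```

A single positive variable is enough to exclude a patient from the low-risk group, which matches the "none of the 10 variables" definition in the abstract.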
Abstract:
The calculation of elasticity parameters from sonic and ultrasonic wave propagation in saturated soils using Biot's theory requires the following variables: formation density and porosity (ρ, φ), compressional and shear wave velocities (Vp, Vs), fluid density, viscosity, and compressibility (ρf, ηf, Kf), and matrix density and compressibility (ρm, Km). The first four parameters can be determined in situ using logging probes. Because fluid and matrix characteristics are not modified during core extraction, they can be obtained through laboratory measurements. All parameters require precise calibration in various environments and for the specific ranges of values encountered in soils. The slim diameter of boreholes in shallow geophysics and the high cost of petroleum equipment demand the use of specific probes, which usually give only qualitative results. Density is measured with a gamma-gamma probe, and the hydrogen index, related to porosity, with a neutron probe. The first step of this work was carried out on synthetic formations in the laboratory, using homogeneous media of known density and porosity. To establish borehole corrections, different casings were used. Finally, a comparison between laboratory and in situ data in cored holes of known geometry and casing was performed.
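Once density and velocities are calibrated, the standard isotropic relations give the elastic parameters directly: shear modulus μ = ρVs² and bulk modulus K = ρ(Vp² − 4Vs²/3). The sketch below applies these textbook relations (SI units); it does not implement the full Biot poroelastic model, which additionally couples the fluid and matrix properties listed above.

```python
def elastic_moduli(rho, vp, vs):
    """Isotropic elastic moduli from bulk density (kg/m^3) and
    wave velocities (m/s): mu = rho*Vs^2, K = rho*(Vp^2 - 4/3*Vs^2)."""
    mu = rho * vs ** 2                        # shear modulus (Pa)
    k = rho * (vp ** 2 - 4.0 * vs ** 2 / 3.0) # bulk modulus (Pa)
    return k, mu
```

This is why precise calibration of ρ, Vp, and Vs matters: errors propagate quadratically through the velocity terms into the moduli.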
Abstract:
The development and tests of an iterative reconstruction algorithm for emission tomography based on Bayesian statistical concepts are described. The algorithm uses the entropy of the generated image as a prior distribution, can be accelerated by the choice of an exponent, and converges uniformly to feasible images by the choice of one adjustable parameter. A feasible image has been defined as one that is consistent with the initial data (i.e. it is an image that, if truly a source of radiation in a patient, could have generated the initial data by the Poisson process that governs radioactive disintegration). The fundamental ideas of Bayesian reconstruction are discussed, along with the use of an entropy prior with an adjustable contrast parameter, the use of likelihood with data increment parameters as conditional probability, and the development of the new fast maximum a posteriori with entropy (FMAPE) algorithm by the successive substitution method. It is shown that in the maximum likelihood estimator (MLE) and FMAPE algorithms, the only correct choice of initial image for the iterative procedure, in the absence of a priori knowledge about the image configuration, is a uniform field.
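The MLE baseline the abstract refers to is usually computed with the multiplicative MLEM update, started (as the abstract recommends) from a uniform image. The sketch below shows plain MLEM, not the FMAPE algorithm with its entropy prior; the system-matrix convention (rows = detector bins, columns = pixels) is an assumption.

```python
import numpy as np

def mlem(A, counts, n_iter=50):
    """Plain MLEM emission reconstruction: uniform initial image,
    then multiplicative updates image *= A^T(counts/(A image)) / sens."""
    image = np.ones(A.shape[1])   # uniform initial image (see abstract)
    sens = A.sum(axis=0)          # sensitivity: back-projection of ones
    for _ in range(n_iter):
        proj = A @ image                               # forward projection
        ratio = np.where(proj > 0, counts / proj, 0.0) # Poisson data ratio
        image *= (A.T @ ratio) / sens                  # multiplicative update
    return image
```

FMAPE would modify this update with the entropy prior and its adjustable contrast parameter; the multiplicative structure, and the need for a uniform starting image, carry over.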
Abstract:
Relationships between porosity and hydraulic conductivity tend to be strongly scale- and site-dependent and are thus very difficult to establish. As a result, hydraulic conductivity distributions inferred from geophysically derived porosity models must be calibrated using some measurement of aquifer response. This type of calibration is potentially very valuable as it may allow for transport predictions within the considered hydrological unit at locations where only geophysical measurements are available, thus reducing the number of well tests required and thereby the costs of management and remediation. Here, we explore this concept through a series of numerical experiments. Considering the case of porosity characterization in saturated heterogeneous aquifers using crosshole ground-penetrating radar and borehole porosity log data, we use tracer test measurements to calibrate a relationship between porosity and hydraulic conductivity that allows the best prediction of the observed hydrological behavior. To examine the validity and effectiveness of the obtained relationship, we examine its performance at alternate locations not used in the calibration procedure. Our results indicate that this methodology allows us to obtain remarkably reliable hydrological predictions throughout the considered hydrological unit based on the geophysical data only. This was also found to be the case when significant uncertainty was considered in the underlying relationship between porosity and hydraulic conductivity.
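A common parametric form for such a site-specific relationship is a power law K = a·φᵇ, fitted in log space. The sketch below calibrates it directly against paired (φ, K) samples; in the study itself the calibration target is the simulated tracer-test response rather than direct conductivity measurements, so this is an illustration of the relationship's form, not of the paper's procedure.

```python
import numpy as np

def calibrate_porosity_k(phi, k_obs):
    """Fit a power-law petrophysical relationship K = a * phi**b
    by least squares in log space; returns (a, b)."""
    A = np.column_stack([np.ones(len(phi)), np.log(phi)])
    (log_a, b), *_ = np.linalg.lstsq(A, np.log(k_obs), rcond=None)
    return np.exp(log_a), b

def predict_k(a, b, phi):
    """Predict hydraulic conductivity from geophysically derived porosity."""
    return a * np.asarray(phi) ** b
```

Once (a, b) are calibrated at one location, `predict_k` can be applied to porosity models at locations where only geophysical data are available, which is the cost-saving idea of the abstract.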
Abstract:
Images acquired using optical microscopes are inherently subject to vignetting effects due to imperfect illumination and image acquisition. However, such vignetting effects hamper accurate extraction of quantitative information from biological images, leading to less effective image segmentation and increased noise in the measurements. Here, we describe a rapid and effective method for vignetting correction, which generates an estimate for a correction function from the background fluorescence without the need to acquire additional calibration images. We validate the usefulness of this algorithm using artificially distorted images as a gold standard for assessing the accuracy of the applied correction and then demonstrate that this correction method enables the reliable detection of biologically relevant variation in cell populations. A simple user interface called FlattifY was developed and integrated into the image analysis platform YeastQuant to facilitate easy application of vignetting correction to a wide range of images.
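The core of such a flat-field style correction is simple: divide the image by a smooth correction function derived from the background fluorescence, normalized so overall intensity is preserved. The sketch below assumes the background estimate is already available; the paper's FlattifY tool estimates it from the image itself without extra calibration images.

```python
import numpy as np

def correct_vignetting(image, background):
    """Divide out a vignetting pattern estimated from background
    fluorescence; normalizing by the background mean preserves the
    overall intensity scale of the image."""
    corr = background / background.mean()  # smooth correction function
    return image / corr
```

Applied to a synthetically vignetted flat field, the correction restores a spatially uniform image, which is how artificially distorted images can serve as a gold standard for validating the method.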
Abstract:
We consider stochastic partial differential equations with multiplicative noise. We derive an algorithm for the computer simulation of these equations. The algorithm is applied to study domain growth of a model with a conserved order parameter. The numerical results corroborate previous analytical predictions obtained by linear analysis.
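A generic discretization of this kind combines a finite-difference Laplacian with an Euler-Maruyama step for the multiplicative noise. The sketch below simulates a 1-D stochastic heat equation du = D·u_xx dt + σ·u dW with periodic boundaries; it is a standard illustration, not the paper's conserved-order-parameter model or its specific algorithm.

```python
import numpy as np

def simulate_spde(n=64, steps=1000, dt=1e-4, D=1.0, sigma=0.1, seed=0):
    """Euler-Maruyama time stepping of du = D*u_xx dt + sigma*u dW
    on a periodic 1-D grid (explicit finite differences)."""
    rng = np.random.default_rng(seed)
    dx = 1.0 / n
    u = np.ones(n)
    for _ in range(steps):
        lap = (np.roll(u, 1) - 2 * u + np.roll(u, -1)) / dx ** 2  # periodic Laplacian
        noise = rng.normal(0.0, np.sqrt(dt), size=n)              # dW increments
        u = u + D * lap * dt + sigma * u * noise                  # multiplicative noise
    return u
```

The explicit step requires the usual stability condition D·dt/dx² < 1/2; a conserved order parameter, as in the abstract's growth model, would instead need a Cahn-Hilliard-type (fourth-order) operator and conserving noise.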