998 results for DETERMINISTIC ESTIMATION


Relevance: 30.00%

Abstract:

This study uses borehole geophysical log data of sonic velocity and electrical resistivity to estimate permeability in sandstones in the northern Galilee Basin, Queensland. The prior estimates of permeability are calculated from deterministic log–log linear empirical correlations between electrical resistivity and measured permeability; both negative and positive relationships occur, depending on the clay content. The prior estimates are updated in a Bayesian framework for three boreholes, using both the cokriging (CK) method and a normal linear regression (NLR) approach to infer the likelihood function. The results show that the mean permeability estimated by the CK-based Bayesian method agrees better with the measured permeability when a fairly clear linear relationship exists between the logarithm of permeability and sonic velocity. In contrast, the NLR-based Bayesian approach gives better permeability estimates for boreholes where no such linear relationship exists.
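A minimal sketch of the NLR variant of this Bayesian update: with the resistivity-based prior and the velocity-based likelihood both expressed as Gaussians in log-permeability, the posterior follows in closed form. All data and regression coefficients below are synthetic placeholders, not the study's basin-specific calibrations.

```python
import numpy as np

# Hypothetical calibration data: measured permeability (log10 mD),
# resistivity (ohm.m) and sonic velocity (m/s) at core plugs.
rng = np.random.default_rng(0)
res = rng.uniform(5, 50, 40)
vel = rng.uniform(3000, 4500, 40)
log_k = 2.0 - 1.2 * np.log10(res) + rng.normal(0, 0.3, 40)  # assumed log-log trend

# Prior: deterministic log-log linear correlation with resistivity.
A = np.vstack([np.ones_like(res), np.log10(res)]).T
coef_prior, *_ = np.linalg.lstsq(A, log_k, rcond=None)
prior_sd = np.std(log_k - A @ coef_prior)

# Likelihood: normal linear regression (NLR) of log-permeability on velocity.
B = np.vstack([np.ones_like(vel), vel]).T
coef_nlr, *_ = np.linalg.lstsq(B, log_k, rcond=None)
nlr_sd = np.std(log_k - B @ coef_nlr)

def posterior_log_k(res_new, vel_new):
    """Conjugate Gaussian update of the resistivity prior with the NLR likelihood."""
    mu_p = coef_prior[0] + coef_prior[1] * np.log10(res_new)  # prior mean
    mu_l = coef_nlr[0] + coef_nlr[1] * vel_new                # likelihood mean
    w_p, w_l = 1 / prior_sd**2, 1 / nlr_sd**2
    mu = (w_p * mu_p + w_l * mu_l) / (w_p + w_l)              # precision-weighted mean
    sd = np.sqrt(1 / (w_p + w_l))
    return mu, sd

print(posterior_log_k(res_new=20.0, vel_new=3800.0))
```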

Relevance: 30.00%

Abstract:

A Monte Carlo model of ultrasound modulation of multiply scattered coherent light in a highly scattering medium has been developed to estimate the phase shift experienced by a photon beam on its transit through the US-insonified region. The phase shift is related to tissue stiffness, thereby opening an avenue for possible breast tumor detection. When the scattering centers in the tissue medium are exposed to deterministic forcing by a focused ultrasound (US) beam, the US-induced oscillation is almost entirely along one direction, that defined by the transducer axis; the scattering events increase, thereby increasing the phase shift experienced by light traversing the medium. The phase shift is found to increase with the anisotropy factor g of the medium. However, as the size of the focused region, the region of interest (ROI), increases, a large number of scattering events take place within the ROI and the ensemble average of the phase shift (Δφ) becomes very close to zero: the phase of an individual photon is randomly distributed over 2π when its scattered path crosses a large number of ultrasound wavelengths in the focused region. This is the case at high ultrasound frequency (1 MHz), when the photon mean free path length l_s is comparable to the wavelength of the US beam. At much lower US frequencies (100 Hz), however, the wavelength of sound is orders of magnitude larger than l_s, and with a high value of g (g ≈ 0.9) there is a distinct, measurable phase difference for photons that traverse the insonified region. Experiments are carried out to validate the simulation results.
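The mechanism can be illustrated with a toy Monte Carlo sketch: a photon performs a random walk with Henyey-Greenstein scattering and accumulates a US-induced phase contribution at each scattering event inside the insonified ROI. All parameter values are assumed for illustration and the direction bookkeeping is deliberately simplified; this is not the paper's full model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy parameters (assumed, not taken from the paper).
ls = 100e-6               # photon mean free path (m)
g = 0.9                   # anisotropy factor
k0 = 2 * np.pi / 633e-9   # optical wavenumber
a_us = 50e-9              # US-induced scatterer displacement amplitude (m)
k_us = 2 * np.pi / 1.5e-3 # US wavenumber inside the insonified ROI
roi = (0.0, 5e-3)         # ROI extent along the transducer (z) axis

def hg_cos_theta():
    """Sample the cosine of the scattering angle from the HG phase function."""
    s = rng.uniform()
    return (1 + g**2 - ((1 - g**2) / (1 - g + 2 * g * s))**2) / (2 * g)

def photon_phase(n_events=200):
    """Accumulate the US-induced phase along one multiply scattered path."""
    z, mu_z, phase = 0.0, 1.0, 0.0
    for _ in range(n_events):
        z += mu_z * rng.exponential(ls)        # free flight along z
        if roi[0] < z < roi[1]:
            d = a_us * np.sin(k_us * z)        # scatterer displacement along axis
            phase += k0 * d * (1 - hg_cos_theta())  # momentum-transfer-weighted shift
        mu_z *= hg_cos_theta()                 # crude direction update (toy model)
    return phase

phases = np.array([photon_phase() for _ in range(2000)])
print("ensemble mean phase:", phases.mean(), "std:", phases.std())
```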

Relevance: 30.00%

Abstract:

The main objective of the paper is to develop a new method to estimate the maximum magnitude (M_max) considering the regional rupture character. The proposed method is explained in detail and examined for both intraplate and active regions. Seismotectonic data were collected for both regions, and seismic study area (SSA) maps were generated for radii of 150, 300, and 500 km. The regional rupture character was established through the percentage fault rupture (PFR), the ratio of subsurface rupture length (RLD) to total fault length (TFL). PFR is used to arrive at RLD, which in turn is used to estimate the maximum magnitude of each seismic source. Maximum magnitudes for both regions were estimated and compared with existing methods for determining M_max. The proposed method gives similar M_max values irrespective of SSA radius and seismicity. Further, seismicity parameters such as the magnitude of completeness (M_c), the "a" and "b" parameters, and the maximum observed magnitude (M_max^obs) were determined for each SSA and used to estimate M_max with all the existing methods. The study shows that existing deterministic and probabilistic M_max estimation methods are sensitive to the SSA radius, M_c, the a and b parameters, and M_max^obs, whereas M_max determined by the proposed method is a function of the rupture character rather than of the seismicity parameters. It was also observed that the intraplate region has a lower PFR than the active seismic region.
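The core of the method reduces to two lines of arithmetic: RLD = PFR × TFL, followed by a rupture-length-to-magnitude relation. The sketch below uses the Wells and Coppersmith (1994) subsurface-rupture-length coefficients as a stand-in relation; the paper's own regression may differ, and the PFR values shown are hypothetical.

```python
import math

def m_max_from_rupture(total_fault_length_km, pfr, a=4.38, b=1.49):
    """Estimate M_max from regional rupture character.

    RLD = PFR * TFL, then a rupture-length -> magnitude relation.
    Defaults are the Wells & Coppersmith (1994) subsurface-rupture-length
    coefficients, used here as a stand-in for the study's relation.
    """
    rld = pfr * total_fault_length_km
    return a + b * math.log10(rld)

# Hypothetical sources: same TFL, lower PFR for the intraplate region.
for label, tfl, pfr in [("intraplate", 120.0, 0.05), ("active", 120.0, 0.26)]:
    print(f"{label}: TFL={tfl} km, PFR={pfr} -> M_max ~ "
          f"{m_max_from_rupture(tfl, pfr):.1f}")
```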

Relevance: 30.00%

Abstract:

A block-based motion estimation technique is proposed which permits a less general segmentation to be performed using an efficient deterministic algorithm. Applied to image pairs from the Flower Garden and Table Tennis sequences, the algorithm successfully localizes motion discontinuities and detects uncovered regions. The algorithm is implemented in C on a Sun Sparcstation 20: the gradient-based motion estimation required 28.8 s of CPU time, and 500 iterations of the segmentation algorithm required 32.6 s.
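For readers unfamiliar with this family of techniques, a generic exhaustive block-matching estimator is sketched below. Note that the paper's estimator is gradient-based and coupled to a segmentation stage, neither of which is reproduced here.

```python
import numpy as np

def block_motion(prev, curr, block=16, search=8):
    """Exhaustive block matching: one integer motion vector per block,
    chosen to minimize the sum of absolute differences (SAD)."""
    prev = prev.astype(float)
    curr = curr.astype(float)
    h, w = prev.shape
    vectors = np.zeros((h // block, w // block, 2), dtype=int)
    for by in range(0, h - block + 1, block):
        for bx in range(0, w - block + 1, block):
            ref = curr[by:by + block, bx:bx + block]
            best, best_sad = (0, 0), np.inf
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y, x = by + dy, bx + dx
                    if 0 <= y <= h - block and 0 <= x <= w - block:
                        sad = np.abs(prev[y:y + block, x:x + block] - ref).sum()
                        if sad < best_sad:
                            best_sad, best = sad, (dy, dx)
            vectors[by // block, bx // block] = best
    return vectors
```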

Relevance: 30.00%

Abstract:

The issue of smoothing in kriging has been addressed either by estimation or by simulation. The solution via estimation calls for postprocessing kriging estimates in order to correct the smoothing effect. Stochastic simulation provides equiprobable images that present no smoothing and reproduce the covariance model; consequently, these images reproduce both the sample histogram and the sample semivariogram. However, simulated images still lack local accuracy. In this paper, a postprocessing algorithm for correcting the smoothing effect of ordinary kriging estimates is compared with sequential Gaussian simulation realizations. Based on samples drawn from exhaustive data sets, the postprocessing algorithm is shown to be superior to any individual simulation realization, albeit at the expense of providing a single deterministic estimate of the random function.
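One simple form such postprocessing can take is an affine correction that restores the sample mean and spread to the smoothed kriging map. The paper's algorithm is more elaborate, so the sketch below should be read as the idea, not the method.

```python
import numpy as np

def desmooth(ok_estimates, sample_mean, sample_std):
    """Affine correction of the kriging smoothing effect: rescale
    ordinary-kriging estimates so their spread matches the sample
    statistics (one simple variant of such postprocessing)."""
    z = np.asarray(ok_estimates, dtype=float)
    return sample_mean + (z - z.mean()) * (sample_std / z.std())

# Example: a kriged map whose variance was shrunk by smoothing.
rng = np.random.default_rng(2)
sample_mean, sample_std = 10.0, 1.0
kriged = sample_mean + 0.6 * rng.normal(0, sample_std, 500)  # smoothed field
corrected = desmooth(kriged, sample_mean, sample_std)
print(kriged.std(), corrected.std())  # spread restored to ~1.0
```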

Relevance: 30.00%

Abstract:

Homeostasis in the intact organism is achieved implicitly by repeated incremental feedback (inhibitory) and feedforward (stimulatory) adjustments enforced via intermittent signal exchange. In separated systems, neurohormone signals act deterministically on target cells via quantifiable effector-response functions. On the other hand, in vivo interglandular signaling dynamics have not been estimable to date. Indeed, experimentally isolating components of an interactive network definitionally disrupts time-sensitive linkages. We implement and validate analytical reconstruction of endogenous effector-response properties via a composite model comprising (i) a deterministic basic feedback and feedforward ensemble structure; (ii) judicious statistical allowance for possible stochastic variability in individual biologically interpretable dose–response properties; and (iii) the sole data requirement of serially observed concentrations of a paired signal (input) and response (output). Application of this analytical strategy to a prototypical neuroendocrine axis in the conscious uninjected horse, sheep, and human (i) illustrates probabilistic estimation of endogenous effector dose–response properties; and (ii) unmasks statistically vivid (2- to 5-fold) random fluctuations in inferred target-gland responsivity within any given pulse train. In conclusion, balanced mathematical formalism allows one to (i) reconstruct deterministic properties of interglandular signaling in the intact mammal and (ii) quantify apparent signal-response variability over short time scales in vivo. The present proof-of-principle experiments introduce a previously undescribed means to estimate time-evolving signal-response relationships without isotope infusion or pathway disruption.
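The kind of signal-response structure the composite model inverts can be illustrated with a forward sketch: a deterministic sigmoidal (Hill-type) effector dose-response function whose potency fluctuates randomly from pulse to pulse, mimicking the 2- to 5-fold variations in target-gland responsivity inferred in vivo. All parameter values are assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

def hill(dose, emax, ec50, n):
    """Deterministic sigmoidal effector dose-response function."""
    return emax * dose**n / (ec50**n + dose**n)

# A pulse train in which responsivity (EC50) varies randomly per pulse.
pulses = rng.uniform(1.0, 5.0, 20)                 # input signal amplitudes
ec50_per_pulse = 2.0 * rng.lognormal(0, 0.4, 20)   # stochastic potency
responses = hill(pulses, emax=10.0, ec50=ec50_per_pulse, n=2.5)

ratio = ec50_per_pulse.max() / ec50_per_pulse.min()
print(f"fold variation in responsivity across the train: {ratio:.1f}")
```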

Relevance: 30.00%

Abstract:

This thesis reports on a quantitative exposure assessment and on an analysis of the attributes of the data used in the estimations, in particular distinguishing between uncertainty and variability. A retrospective assessment of exposure to benzene was carried out for a case-control study of leukaemia in the Australian petroleum industry. The study used the means of personal task-based measurements (Base Estimates) in a deterministic algorithm and applied factors to model back to places, times, etc. for which no exposure measurements were available. Mean daily exposures were estimated on an individual-subject basis by summing the task-based exposures. These mean exposures were multiplied by the years spent in each job to provide exposure estimates in ppm-years, which were summed to give a Cumulative Estimate for each subject. Validation was completed for the model and its key inputs. Exposures were low; most jobs were below a TWA of 5 ppm benzene, and exposures in terminals were generally higher than at refineries. Cumulative Estimates ranged from 0.005 to 50.9 ppm-years, with 84 percent below 10 ppm-years. Exposure probability distributions were developed for tanker drivers using Monte Carlo simulation of the exposure estimation algorithm; the outcome was a lognormal distribution of exposure for each driver. These distributions provide the basis for alternative risk assessment metrics, e.g. the frequency of short but intense exposures, which contribute only minimally to the long-term average exposure but may increase the risk of leukaemia. The effects of different inputs to the model were examined and their significance assessed using Monte Carlo simulation; the Base Estimates were the most important determinant of exposure in the model. The sources of variability in the measured data were examined, including the effect of censored data and between- and within-worker variability. The sources of uncertainty in the exposure estimates were analysed and consequent improvements in exposure assessment identified. Monte Carlo sampling was also used to examine the uncertainty and variability associated with the tanker drivers' exposure assessment, to derive an estimate of the range, and to place confidence intervals on the daily mean exposures; the identified uncertainty was less than the variability associated with the estimates. The traditional approach to exposure estimation typically yields only point estimates of mean exposure. The approach developed here allows a range of exposure estimates to be made and provides a more flexible and improved basis for risk assessment.
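A Monte Carlo pass through a deterministic task-based exposure algorithm of this kind can be sketched briefly. The Base Estimates, task durations, and job tenure below are hypothetical, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical task-based Base Estimates for one tanker driver:
# (geometric mean ppm, geometric SD, hours per day) -- illustrative only.
tasks = [(0.5, 2.0, 1.5), (2.0, 2.5, 0.5), (0.1, 1.8, 6.0)]
years_on_job = 12

def simulate_daily_exposure(n=10_000):
    """Monte Carlo pass through the deterministic exposure algorithm:
    sample each task's concentration from a lognormal Base Estimate and
    time-weight over an 8 h day."""
    total = np.zeros(n)
    for gm, gsd, hours in tasks:
        conc = rng.lognormal(np.log(gm), np.log(gsd), n)
        total += conc * hours
    return total / 8.0  # ppm, 8 h TWA

daily = simulate_daily_exposure()
cumulative = daily * years_on_job  # ppm-years
lo, hi = np.percentile(cumulative, [2.5, 97.5])
print(f"mean {cumulative.mean():.2f} ppm-years, 95% interval [{lo:.2f}, {hi:.2f}]")
```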

Relevance: 30.00%

Abstract:

In the last couple of decades we have witnessed a reappraisal of spatial design-based techniques. Usually, information on the spatial locations of the individuals of a population has been used to develop efficient sampling designs. This thesis offers a new technique for inference on both individual values and global population values that employs the spatial information available before sampling at the estimation level, by rewriting a deterministic interpolator within a design-based framework. The resulting point estimator of the individual values is treated both for finite spatial populations and for continuous spatial domains, while the theory on the estimator of the global population value covers the finite-population case only. A fairly broad simulation study compares the results of the point estimator with the simple random sampling without replacement estimator in predictive form and with kriging, the benchmark technique for inference on spatial data. The Monte Carlo experiment is carried out on populations generated according to different superpopulation methods, in order to control different aspects of the spatial structure. The simulation outcomes show that the proposed point estimator behaves almost the same as the kriging predictor regardless of the parameters adopted for generating the populations, especially for low sampling fractions. Moreover, the use of the spatial information substantially improves design-based spatial inference on individual values.
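The generic ingredients, a deterministic interpolator evaluated on a design-based SRSWOR sample, can be sketched as follows. The thesis's actual estimator is not reproduced; inverse-distance weighting is used here purely as a stand-in interpolator, on a synthetic finite population.

```python
import numpy as np

rng = np.random.default_rng(5)

# Finite spatial population on a grid with a smooth spatial trend.
xy = np.array([(i, j) for i in range(20) for j in range(20)], float)
y = np.sin(xy[:, 0] / 5) + np.cos(xy[:, 1] / 7) + rng.normal(0, 0.1, len(xy))

# Simple random sample without replacement (SRSWOR) of n units.
n = 60
s = rng.choice(len(xy), n, replace=False)

def idw(point, power=2.0):
    """Deterministic inverse-distance interpolator evaluated on the sample."""
    d = np.linalg.norm(xy[s] - point, axis=1)
    if d.min() == 0:
        return y[s][d.argmin()]  # sampled unit: return its observed value
    w = d**-power
    return (w * y[s]).sum() / w.sum()

preds = np.array([idw(p) for p in xy])
print("RMSE of individual-value predictions:", np.sqrt(((preds - y)**2).mean()))
```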

Relevance: 30.00%

Abstract:

We consider a time-inhomogeneous diffusion process given by a stochastic differential equation whose drift term contains a deterministic T-periodic signal of known periodicity. The signal is assumed to lie in a Besov space. We estimate it with a nonparametric wavelet estimator, inspired by the thresholded wavelet density estimator constructed in 1996 for a classical i.i.d. model by Donoho, Johnstone, Kerkyacharian and Picard. Under certain ergodicity assumptions on the process we obtain nonparametric convergence rates that match the rates of the classical i.i.d. case up to a logarithmic term. These rates are proved via oracle inequalities, which rest on results for discrete-time Markov chains due to Clémençon (2001). We also consider a technically simpler special case and present some computer simulations of the estimator.
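A minimal sketch of the underlying Donoho-Johnstone-style wavelet thresholding, applied to noisy samples of a periodic signal as a stand-in for the diffusion setting (the SDE itself is not simulated), assuming the PyWavelets package:

```python
import numpy as np
import pywt  # PyWavelets

rng = np.random.default_rng(6)

# Proxy data: noisy observations of a T-periodic signal over one period.
t = np.linspace(0, 1, 1024, endpoint=False)
signal = np.sin(2 * np.pi * t) + 0.5 * np.sign(np.sin(6 * np.pi * t))
obs = signal + rng.normal(0, 0.4, t.size)

# Wavelet estimator with hard thresholding and the universal threshold.
coeffs = pywt.wavedec(obs, 'db4', mode='periodization', level=6)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745   # noise level from finest scale
thr = sigma * np.sqrt(2 * np.log(obs.size))      # universal threshold
coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode='hard') for c in coeffs[1:]]
estimate = pywt.waverec(coeffs, 'db4', mode='periodization')

print("RMSE:", np.sqrt(np.mean((estimate - signal)**2)))
```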

Relevance: 30.00%

Abstract:

This study was motivated by the need to densify Global Horizontal Irradiance (GHI) observations: increasing the number of surface weather stations that observe it, using sensors with sub-hourly periodicity, and examining methods for spatial GHI estimation (by interpolation) at that periodicity in other locations. The aim of the present research project is to analyse the accuracy of 15-minute spatial GHI estimates for five methods over the territory of Spain: three geostatistical interpolation methods, one deterministic method, and the HelioSat2 method, which is based on satellite images. The research concludes that, when the work area has adequate station density, the best method for estimating GHI every 15 min is Regression Kriging interpolation using GHI estimated from satellite images as one of the input variables. Conversely, when station density is low, the best method is estimating GHI directly from satellite images. A comparison between the GHI observed by volunteer stations and the estimation model applied shows that 67% of the volunteer stations analysed present values within the margin of error (mean ± 2 standard deviations).
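Regression Kriging combines a regression on the satellite covariate with kriging of the regression residuals. A minimal sketch, assuming scikit-learn and using a Gaussian-process regressor as the residual-kriging step; the stations, coordinates, and irradiance values below are synthetic.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(7)

# Hypothetical stations: coordinates (km), satellite-estimated GHI and
# ground-observed 15-min GHI (W/m2).
xy = rng.uniform(0, 500, (80, 2))
ghi_sat = rng.uniform(200, 900, 80)
ghi_obs = 0.9 * ghi_sat + 30 + rng.normal(0, 25, 80)

# Regression kriging = regression on the covariate + kriging of residuals.
reg = LinearRegression().fit(ghi_sat[:, None], ghi_obs)
resid = ghi_obs - reg.predict(ghi_sat[:, None])
gp = GaussianProcessRegressor(RBF(100.0) + WhiteKernel(25.0**2)).fit(xy, resid)

def predict_ghi(xy_new, ghi_sat_new):
    """Trend from satellite GHI plus kriged residual at the new location."""
    trend = reg.predict(np.atleast_1d(ghi_sat_new)[:, None])
    return trend + gp.predict(np.atleast_2d(xy_new))

print(predict_ghi([250.0, 250.0], 600.0))
```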

Relevance: 30.00%

Abstract:

Whole-brain resting-state connectivity is a promising biomarker that might help obtain an early diagnosis in many neurological diseases, such as dementia. Inferring resting-state connectivity is often based on correlations, which are sensitive to indirect connections and thus give an inaccurate representation of the real backbone of the network. The precision matrix is a better representation of whole-brain connectivity, as it considers only direct connections. The network structure can be estimated with the graphical lasso (GL), which achieves sparsity through l1-regularization of the precision matrix. In this paper, we propose a structural-connectivity-adaptive version of the GL, in which weaker anatomical connections are represented as stronger penalties on the corresponding functional connections. We applied beamformer source reconstruction to the resting-state MEG recordings of 81 subjects, of whom 29 were healthy controls, 22 had single-domain amnestic mild cognitive impairment (MCI), and 30 had multiple-domain amnestic MCI. An atlas-based anatomical parcellation into 66 regions was obtained for each subject, and time series were assigned to each region. The fiber densities between the regions, obtained with deterministic tractography from diffusion-weighted MRI, were used to define the anatomical connectivity. Precision matrices were obtained from the region-specific time series in five frequency bands. We compared our method with the traditional GL and with a functionally adaptive version of the GL, in terms of log-likelihood and classification accuracy between the three groups. We conclude that introducing an anatomical prior improves the expressivity of the model and, in most cases, leads to better classification between groups.
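A structural-connectivity-adaptive penalty can be prototyped with a weighted graphical lasso, solved here by a short ADMM loop with an elementwise penalty matrix. The mapping from fiber density to penalty below is an assumption, and the paper's exact formulation may differ.

```python
import numpy as np

def weighted_graphical_lasso(S, Lam, rho=1.0, n_iter=200):
    """ADMM solver for: min -logdet(Theta) + tr(S @ Theta) + ||Lam * Theta||_1,
    with an elementwise penalty matrix Lam (stronger penalties on weaker
    anatomical links). A minimal sketch of the idea, not the paper's solver."""
    p = S.shape[0]
    Z = np.eye(p)
    U = np.zeros((p, p))
    for _ in range(n_iter):
        # Theta update: closed form via eigendecomposition.
        w, Q = np.linalg.eigh(rho * (Z - U) - S)
        theta_eig = (w + np.sqrt(w**2 + 4 * rho)) / (2 * rho)
        Theta = Q @ np.diag(theta_eig) @ Q.T
        # Z update: elementwise soft thresholding with the matrix penalty.
        X = Theta + U
        Z = np.sign(X) * np.maximum(np.abs(X) - Lam / rho, 0.0)
        U += Theta - Z
    return Z

# Penalties: weaker anatomical (fiber-density) links -> stronger penalties.
rng = np.random.default_rng(8)
p = 10
S = np.cov(rng.normal(size=(200, p)), rowvar=False)   # sample covariance
fiber = rng.uniform(0, 1, (p, p))
fiber = (fiber + fiber.T) / 2                          # symmetric fiber densities
Lam = 0.2 * (1.0 - fiber)          # assumed mapping: low density, high penalty
np.fill_diagonal(Lam, 0.0)         # leave the diagonal unpenalized
precision = weighted_graphical_lasso(S, Lam)
print(np.round(precision, 2))
```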

Relevance: 30.00%

Abstract:

This paper deals with the estimation of a time-invariant channel spectrum from its own nonuniform samples, assuming there is a bound on the channel’s delay spread. Except for this last assumption, this is the basic estimation problem in systems providing channel spectral samples. However, as shown in the paper, the delay spread bound leads us to view the spectrum as a band-limited signal, rather than the Fourier transform of a tapped delay line (TDL). Using this alternative model, a linear estimator is presented that approximately minimizes the expected root-mean-square (RMS) error for a deterministic channel. Its main advantage over the TDL is that it takes into account the spectrum’s smoothness (time width), thus providing a performance improvement. The proposed estimator is compared numerically with the maximum likelihood (ML) estimator based on a TDL model in pilot-assisted channel estimation (PACE) for OFDM.
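One way to exploit the delay-spread bound is to treat the spectrum as band-limited in the delay domain and interpolate the nonuniform samples with a sinc kernel. The sketch below assumes a flat, symmetric delay power profile to keep everything real-valued; it illustrates the band-limited viewpoint, not the paper's exact minimum-RMS estimator.

```python
import numpy as np

def sinc_kernel(f1, f2, tau_max):
    """Covariance-like kernel for a spectrum whose delay support is
    [-tau_max/2, tau_max/2] with flat delay power (an assumed profile)."""
    return np.sinc(np.subtract.outer(f1, f2) * tau_max)

def estimate_spectrum(f_pilot, h_pilot, f_eval, tau_max, noise_var):
    """Linear estimator exploiting the delay-spread bound: regularized
    interpolation of nonuniform spectral samples with the sinc kernel."""
    K = sinc_kernel(f_pilot, f_pilot, tau_max) + noise_var * np.eye(len(f_pilot))
    k = sinc_kernel(f_eval, f_pilot, tau_max)
    return k @ np.linalg.solve(K, h_pilot)

# Synthetic channel with delay content inside the bound.
rng = np.random.default_rng(9)
tau_max = 2e-6
f_pilot = np.sort(rng.uniform(0, 5e6, 30))   # nonuniform pilot frequencies
H = lambda f: np.cos(2 * np.pi * f * 0.3e-6) + 0.5 * np.cos(2 * np.pi * f * 0.8e-6)
h_pilot = H(f_pilot) + rng.normal(0, 0.05, f_pilot.size)

f_eval = np.linspace(0, 5e6, 200)
H_hat = estimate_spectrum(f_pilot, h_pilot, f_eval, tau_max, noise_var=0.05**2)
print("max abs error:", np.abs(H_hat - H(f_eval)).max())
```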

Relevance: 30.00%

Abstract:

2000 Mathematics Subject Classification: 60J60, 62M99.