930 results for normalized heating parameter


Relevance:

20.00%

Publisher:

Abstract:

In this paper, we propose a novel online hidden Markov model (HMM) parameter estimator based on the new information-theoretic concept of one-step Kerridge inaccuracy (OKI). Under several regularity conditions, we establish a convergence result (and some limited strong consistency results) for our proposed online OKI-based parameter estimator. In simulation studies, we illustrate the global convergence behaviour of our proposed estimator and provide a counter-example illustrating the merely local convergence of other popular HMM parameter estimators.

Relevance:

20.00%

Publisher:

Abstract:

The size and arrangement of stromal collagen fibrils (CFs) influence the optical properties of the cornea and hence its function. How the spatial arrangement of the collagen relates to fibril diameter remains an open question. In the present study, we introduce a new parameter, the edge-fibrillar distance (EFD), which measures how two collagen fibrils are spaced with respect to their closest edges, and we characterize their spatial distribution through the normalized standard deviation of EFD (NSDEFD); both were assessed following the application of two commercially available multipurpose solutions (MPS): ReNu and Hippia. The corneal buttons were soaked separately in ReNu and Hippia MPS for five hours, fixed overnight in 2.5% glutaraldehyde containing cuprolinic blue, and processed for transmission electron microscopy. The electron micrographs were processed using a user-coded ImageJ plugin. Statistical analysis was performed to compare the image-processed equivalent diameter (ED), inter-fibrillar distance (IFD), and EFD of the CFs of treated versus normal corneas. The ReNu-soaked cornea showed a partly degenerated epithelium with loose hemidesmosomes and Bowman's collagen. In contrast, the epithelium of the cornea soaked in Hippia was degenerated or lost but showed closely packed Bowman's collagen. Soaking the corneas in either MPS caused a statistically significant decrease in the anterior collagen fibril ED and significant changes in IFD and EFD relative to the untreated corneas (p < 0.05 for all comparisons). The EFD measurement directly conveyed the gaps between the peripheries of the collagen bundles and their spatial distribution; in combination with ED, it showed how the corneal collagen bundles are spaced in relation to their diameters. The spatial distribution parameter NSDEFD indicated that fibrils in ReNu-treated corneas were the most uniformly distributed spatially, followed by normal and then Hippia-treated corneas. The EFD measurement, with a relatively low standard deviation and NSDEFD (a characteristic of uniform CF distribution), can serve as an additional parameter for evaluating collagen organization and assessing the effects of various treatments on corneal health and transparency.
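
The EFD and NSDEFD measures described above can be sketched numerically. The snippet below is a minimal illustration that assumes fibrils are modelled as circles, so the edge-to-edge spacing is the centre distance minus the two radii, and that NSDEFD is the standard deviation of EFD normalized by its mean; the exact definitions used in the ImageJ plugin may differ.

```python
import numpy as np

def edge_fibrillar_distance(c1, c2, d1, d2):
    """Edge-to-edge spacing of two fibrils modelled as circles:
    centre-to-centre distance minus the two radii."""
    return np.linalg.norm(np.asarray(c1, float) - np.asarray(c2, float)) - (d1 + d2) / 2.0

def nsdefd(efds):
    """Normalized standard deviation of EFD values: std / mean.
    Lower values indicate a more uniform spatial distribution."""
    efds = np.asarray(efds, dtype=float)
    return efds.std() / efds.mean()
```

For example, two fibrils of diameter 4 with centres 10 apart have an EFD of 6, and a set of identical EFD values yields an NSDEFD of 0 (perfectly uniform spacing).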

Relevance:

20.00%

Publisher:

Abstract:

The total entropy utility function is considered for the dual purpose of Bayesian design for model discrimination and parameter estimation. A sequential design setting is proposed, and it is shown how to efficiently estimate the total entropy utility for a wide variety of data types. Utility estimation relies on forming particle approximations to a number of intractable integrals, which is afforded by the use of a sequential Monte Carlo algorithm for Bayesian inference. A number of motivating examples are considered to demonstrate the performance of total entropy in comparison to utilities targeting model discrimination or parameter estimation alone. The results suggest that the total entropy utility selects designs which are efficient under both experimental goals, with little compromise in achieving either goal. As such, the total entropy utility is advocated as a general utility for Bayesian design in the presence of model uncertainty.
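
As a rough illustration of Monte Carlo utility estimation, the sketch below estimates an expected-information-gain utility by a plain nested scheme for a toy one-parameter normal model; it is not the particle/SMC total entropy estimator of the abstract, and the model and sample sizes are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def utility(design, n_outer=500, n_inner=500):
    """Nested Monte Carlo estimate of the expected information gain for
    the toy model y ~ Normal(theta * design, 1), theta ~ Normal(0, 1).
    The Gaussian normalizing constants cancel in the log-ratio below."""
    thetas = rng.normal(0.0, 1.0, n_outer)
    ys = rng.normal(thetas * design, 1.0)
    log_lik = -0.5 * (ys - thetas * design) ** 2   # log p(y_i | theta_i) + const
    inner = rng.normal(0.0, 1.0, n_inner)          # fresh prior draws for the evidence
    log_ev = np.log(np.mean(
        np.exp(-0.5 * (ys[:, None] - inner[None, :] * design) ** 2), axis=1))
    return np.mean(log_lik - log_ev)               # estimated utility U(design)
```

For this model a larger |design| makes the data more informative about theta, so the estimated utility should increase with the design value.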

Relevance:

20.00%

Publisher:

Abstract:

In this paper we have used simulations to make a conjecture about the coverage of a t-dimensional subspace of a d-dimensional parameter space of size n when performing k trials of Latin Hypercube sampling. This takes the form P(k,n,d,t) = 1 - e^(-k/n^(t-1)). We suggest that this coverage formula is independent of d and this allows us to make connections between building Populations of Models and Experimental Designs. We also show that Orthogonal sampling is superior to Latin Hypercube sampling in terms of allowing a more uniform coverage of the t-dimensional subspace at the sub-block size level. These ideas have particular relevance when attempting to perform uncertainty quantification and sensitivity analyses.
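
The conjectured coverage formula can be checked with a small simulation. The sketch below is an illustrative implementation (the cell-counting notion of "coverage" is an assumption): it draws k independent Latin Hypercube samples of n points in d dimensions and measures the fraction of the n^t cells of a t-dimensional projection that are hit, for comparison with 1 - e^(-k/n^(t-1)).

```python
import numpy as np

rng = np.random.default_rng(1)

def lhs(n, d):
    """One Latin Hypercube sample of n points in [0,1)^d:
    each of the n bins per axis is hit exactly once."""
    return (np.array([rng.permutation(n) for _ in range(d)]).T
            + rng.random((n, d))) / n

def subspace_coverage(k, n, d, t):
    """Fraction of the n**t cells of the first-t-coordinate projection
    hit by k independent LHS trials of n points each."""
    hit = set()
    for _ in range(k):
        cells = np.floor(lhs(n, d)[:, :t] * n).astype(int)
        hit.update(map(tuple, cells))
    return len(hit) / n ** t

n, d, t, k = 10, 5, 2, 10
observed = subspace_coverage(k, n, d, t)
predicted = 1 - np.exp(-k / n ** (t - 1))   # conjectured P(k, n, d, t)
```

With these values the conjecture predicts a coverage of about 0.63, and the simulated fraction should land close to it regardless of d.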

Relevance:

20.00%

Publisher:

Abstract:

Stochastic (or random) processes are inherent to numerous fields of human endeavour, including engineering, science, business, and finance. This thesis presents multiple novel methods for quickly detecting and estimating uncertainties in several important classes of stochastic processes. The significance of these novel methods is demonstrated by employing them to detect aircraft manoeuvres in video signals in the important application of autonomous mid-air collision avoidance.

Relevance:

20.00%

Publisher:

Abstract:

In this paper we provide estimates for the coverage of parameter space when using Latin Hypercube Sampling, which forms the basis of building so-called populations of models. The estimates are obtained using combinatorial counting arguments to determine how many trials, k, are needed in order to obtain specified parameter space coverage for a given value of the discretisation size n. In the case of two dimensions, we show that if the ratio φ = k/n of trials to discretisation size is greater than 1, then as n becomes moderately large the fractional coverage behaves as 1 - e^(-φ). We compare these estimates with simulation results obtained from an implementation of Latin Hypercube Sampling using MATLAB.

Relevance:

20.00%

Publisher:

Abstract:

This paper demonstrates the procedures for probabilistic assessment of a pesticide fate and transport model, PCPF-1, to elucidate the modeling uncertainty using the Monte Carlo technique. Sensitivity analyses are performed to investigate the influence of herbicide characteristics and related soil properties on model outputs using four popular rice herbicides: mefenacet, pretilachlor, bensulfuron-methyl and imazosulfuron. Uncertainty quantification showed that the simulated concentrations in paddy water varied more than those of paddy soil. This tendency decreased as the simulation proceeded to a later period but remained important for herbicides having either high solubility or a high 1st-order dissolution rate. The sensitivity analysis indicated that the PCPF-1 parameters requiring careful determination are primarily those involved with herbicide adsorption (the organic carbon content, the bulk density, and the volumetric saturated water content); secondarily, parameters related to herbicide mass distribution between paddy water and soil (1st-order desorption and dissolution rates); and lastly, those involving herbicide degradation. © Pesticide Science Society of Japan.
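
A schematic of this kind of Monte Carlo sensitivity analysis is sketched below. It is a toy stand-in, not PCPF-1: the model form, parameter names, and ranges are all illustrative assumptions. The pattern is the same, though: sample the uncertain inputs, run the model, and rank the inputs by rank correlation with the output.

```python
import numpy as np

rng = np.random.default_rng(2)

def toy_concentration(koc, bulk_density, theta_sat):
    """Stand-in for a fate-and-transport output: a larger sorption
    coefficient kd lowers the simulated water-phase concentration."""
    kd = koc * 0.01 * bulk_density / theta_sat
    return 1.0 / (1.0 + kd)

n = 1000
params = {
    "koc": rng.uniform(50, 500, n),           # illustrative sorption parameter
    "bulk_density": rng.uniform(0.8, 1.4, n), # g/cm^3, illustrative range
    "theta_sat": rng.uniform(0.4, 0.7, n),    # saturated water content
}
out = toy_concentration(**params)

def rank_corr(x, y):
    """Spearman rank correlation, computed from scratch."""
    rx, ry = np.argsort(np.argsort(x)), np.argsort(np.argsort(y))
    return np.corrcoef(rx, ry)[0, 1]

sensitivity = {k: rank_corr(v, out) for k, v in params.items()}
```

Here the sorption parameter spans the widest relative range, so it dominates the output ranking; the sign of each correlation indicates the direction of influence.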

Relevance:

20.00%

Publisher:

Abstract:

We report here on a series of laboratory experiments on plumes, undertaken with the object of simulating the effect of the heat release that occurs in clouds on condensation of water vapor. The experimental technique used for this purpose relies on ohmic heating generated in an electrically conducting plume fluid subjected to a suitable alternating voltage across specified axial stations in the plume flow [Bhat et al., 1989]. The present series of experiments achieves a value of the Richardson number that is toward the lower end of the range that characterizes cumulus clouds. It is found that the buoyancy enhancement due to heating disrupts the eddy structures in the flow and reduces the dilution owing to entrainment of ambient fluid that would otherwise have occurred in the central region of the plume. Heating also reduces the spread rate of the plume, but as it accelerates the flow as well, the overall specific mass flux in the plume does not show a very significant change at the heat input employed in the experiment. However, there is some indication that the entrainment rate (proportional to the streamwise derivative of the mass flux) is slightly higher immediately after heat injection and slightly lower farther downstream. The measurements support a previous proposal for a cloud scenario [Bhat and Narasimha, 1996] and demonstrate how fresh insights into certain aspects of the fluid dynamics of clouds may be derived from the experimental techniques employed here.

Relevance:

20.00%

Publisher:

Abstract:

A microcontroller-based thermal energy meter cum controller (TEMC) suitable for solar thermal systems has been developed. It monitors solar radiation, ambient temperature, fluid flow rate, and the temperature of the fluid at various locations in the system, and computes the energy transfer rate. It also controls the operation of the fluid-circulating pump depending on the temperature difference across the solar collector field. The accuracy of energy measurement is +/-1.5%. The instrument has been tested in a solar water heating system, where its automatic operation yielded savings of 30% in the electrical energy consumption of the pump on cloudy days.
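
The core computations of such a meter can be sketched as follows. This is a hedged illustration: the Q = m_dot * c_p * dT form of the energy-rate calculation and the hysteresis thresholds of the pump controller are assumptions, not the instrument's actual firmware.

```python
CP_WATER = 4186.0  # specific heat of water, J/(kg*K)

def energy_transfer_rate(flow_rate_kg_s, t_out_c, t_in_c, cp=CP_WATER):
    """Thermal power delivered by the fluid loop, in watts:
    Q = m_dot * c_p * (T_out - T_in)."""
    return flow_rate_kg_s * cp * (t_out_c - t_in_c)

def pump_should_run(t_collector_c, t_tank_c, on_delta=6.0, off_delta=2.0,
                    running=False):
    """Differential controller with hysteresis for the circulating pump:
    start when the collector-tank difference exceeds on_delta, stop only
    once it falls below off_delta, to avoid rapid pump cycling."""
    delta = t_collector_c - t_tank_c
    if running:
        return delta > off_delta
    return delta > on_delta
```

For example, 0.05 kg/s of water heated by 20 K corresponds to about 4.2 kW; the hysteresis band keeps the pump from chattering on cloudy days with fluctuating collector temperatures.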

Relevance:

20.00%

Publisher:

Abstract:

Background: Biomechanical stresses play an important role in determining plaque stability. Quantification of these simulated stresses can potentially be used to assess plaque vulnerability and differentiate patient groups. Methods and Results: 54 asymptomatic and 45 acutely symptomatic patients underwent in vivo multicontrast magnetic resonance imaging (MRI) of the carotid arteries. Plaque geometry used for finite element analysis was derived from in vivo MRI at the sites of maximum and minimum plaque burden. In total, 198 slices were used for the computational simulations. A pre-shrink technique was used to refine the simulation. The maximum principal stress at the vulnerable plaque sites (ie, critical stress) was extracted for the selected slices and compared between the 2 groups. Critical stress in the slice with maximum plaque burden was significantly higher in acutely symptomatic patients than in asymptomatic patients (median, interquartile range: 198.0 kPa (119.8-359.0 kPa) vs 138.4 kPa (83.8-242.6 kPa), P=0.04). No significant difference was found in the slice with minimum plaque burden between the 2 groups (196.7 kPa (133.3-282.7 kPa) vs 182.4 kPa (117.2-310.6 kPa), P=0.82). Conclusions: Acutely symptomatic carotid plaques have significantly higher biomechanical stresses than asymptomatic plaques. This might be potentially useful for establishing a biomechanical risk stratification criterion based on plaque burden in future studies.

Relevance:

20.00%

Publisher:

Abstract:

The LISA Parameter Estimation Taskforce was formed in September 2007 to provide the LISA Project with vetted codes, source distribution models and results related to parameter estimation. The Taskforce's goal is to be able to quickly calculate the impact of any mission design changes on LISA's science capabilities, based on reasonable estimates of the distribution of astrophysical sources in the universe. This paper describes our Taskforce's work on massive black-hole binaries (MBHBs). Given present uncertainties in the formation history of MBHBs, we adopt four different population models, based on (i) whether the initial black-hole seeds are small or large and (ii) whether accretion is efficient or inefficient at spinning up the holes. We compare four largely independent codes for calculating LISA's parameter-estimation capabilities. All codes are based on the Fisher-matrix approximation, but in the past they used somewhat different signal models, source parametrizations and noise curves. We show that once these differences are removed, the four codes give results in extremely close agreement with each other. Using a code that includes both spin precession and higher harmonics in the gravitational-wave signal, we carry out Monte Carlo simulations and determine the number of events that can be detected and accurately localized in our four population models.
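
The Fisher-matrix approximation that these codes share can be illustrated on a toy signal. The sketch below uses a two-parameter sinusoid in white noise rather than a LISA waveform (all numbers are illustrative): the Fisher matrix is built from derivatives of the signal with respect to the parameters, and its inverse gives the Cramer-Rao estimate of the parameter covariance.

```python
import numpy as np

def fisher_matrix(model, theta, t, sigma, eps=1e-6):
    """Fisher matrix F_ij = sum_t (d_i h)(d_j h) / sigma^2 for white noise
    of standard deviation sigma, using central finite differences."""
    theta = np.asarray(theta, dtype=float)
    grads = []
    for i in range(theta.size):
        dp, dm = theta.copy(), theta.copy()
        dp[i] += eps
        dm[i] -= eps
        grads.append((model(dp, t) - model(dm, t)) / (2 * eps))
    grads = np.array(grads)
    return grads @ grads.T / sigma**2

def sinusoid(theta, t):
    """Toy signal h(t; amp, freq) = amp * sin(2*pi*freq*t)."""
    amp, freq = theta
    return amp * np.sin(2 * np.pi * freq * t)

t = np.linspace(0.0, 10.0, 1000)
F = fisher_matrix(sinusoid, [1.0, 0.5], t, sigma=0.1)
cov = np.linalg.inv(F)          # Fisher-matrix estimate of parameter covariance
sigma_amp = np.sqrt(cov[0, 0])  # 1-sigma uncertainty on the amplitude
```

In the same spirit, the Taskforce codes evaluate waveform derivatives with respect to the source parameters and invert the Fisher matrix to forecast parameter-estimation accuracy.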

Relevance:

20.00%

Publisher:

Abstract:

The method of generalized estimating equations (GEEs) provides consistent estimates of the regression parameters in a marginal regression model for longitudinal data, even when the working correlation model is misspecified (Liang and Zeger, 1986). However, the efficiency of a GEE estimate can be seriously affected by the choice of the working correlation model. This study addresses this problem by proposing a hybrid method that combines multiple GEEs based on different working correlation models, using the empirical likelihood method (Qin and Lawless, 1994). Analyses show that this hybrid method is more efficient than a GEE using a misspecified working correlation model. Furthermore, if one of the working correlation structures correctly models the within-subject correlations, then this hybrid method provides the most efficient parameter estimates. In simulations, the hybrid method's finite-sample performance is superior to a GEE under any of the commonly used working correlation models and is almost fully efficient in all scenarios studied. The hybrid method is illustrated using data from a longitudinal study of the respiratory infection rates in 275 Indonesian children.
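
To make the estimating-equation setup concrete, here is a minimal sketch of a single GEE for a linear marginal model with a fixed exchangeable working correlation. It is illustrative only: it does not implement the empirical-likelihood hybrid, and estimation of the correlation parameter from residuals is omitted.

```python
import numpy as np

def gee_linear(X_groups, y_groups, rho):
    """Solve the GEE  sum_i X_i' V_i^{-1} (y_i - X_i beta) = 0  for a
    linear marginal model, with V_i an exchangeable working correlation
    matrix of parameter rho. For the linear model this is one solve."""
    p = X_groups[0].shape[1]
    A = np.zeros((p, p))
    b = np.zeros(p)
    for X, y in zip(X_groups, y_groups):
        m = len(y)
        V = (1 - rho) * np.eye(m) + rho * np.ones((m, m))  # exchangeable
        Vinv = np.linalg.inv(V)
        A += X.T @ Vinv @ X
        b += X.T @ Vinv @ y
    return np.linalg.solve(A, b)

# Exact-fit check: with y = X @ [1, 2], any valid rho recovers [1, 2],
# illustrating consistency under an arbitrary working correlation.
X1 = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])
X2 = np.array([[1.0, 3.0], [1.0, 4.0], [1.0, 5.0]])
beta_true = np.array([1.0, 2.0])
beta = gee_linear([X1, X2], [X1 @ beta_true, X2 @ beta_true], rho=0.3)
```

The choice of rho does not affect consistency, only efficiency, which is exactly the gap the hybrid empirical-likelihood method of the abstract targets.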

Relevance:

20.00%

Publisher:

Abstract:

We address the issue of complexity for vector quantization (VQ) of wide-band speech LSF (line spectrum frequency) parameters. The recently proposed switched split VQ (SSVQ) method provides better rate-distortion (R/D) performance than the traditional split VQ (SVQ) method, even at lower computational complexity, but at the expense of much higher memory. We develop the two-stage SVQ (TsSVQ) method, by which we gain both memory and computational advantages and still retain good R/D performance. The proposed TsSVQ method uses a full-dimensional quantizer in its first stage, exploiting all the higher-dimensional coding advantages, and then uses an SVQ method to quantize the residual vector in the second stage so as to reduce complexity. We also develop a transform-domain residual coding method in this two-stage architecture that further reduces the computational complexity. To design an effective residual codebook in the second stage, variance normalization of Voronoi regions is carried out, which leads to the design of two new methods, referred to as normalized two-stage SVQ (NTsSVQ) and normalized two-stage transform-domain SVQ (NTsTrSVQ). These two new methods have complementary strengths and hence are combined in a switched VQ mode, which leads to further improvement in R/D performance while retaining the low complexity requirement. We evaluate the performance of the new methods for wide-band speech LSF parameter quantization and show their advantages over the established SVQ and SSVQ methods.
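
The two-stage structure can be sketched as follows. This is a bare illustration of the encoding path only (random toy codebooks, no transform-domain coding or variance normalization of Voronoi regions, which the actual TsSVQ/NTsTrSVQ methods add).

```python
import numpy as np

rng = np.random.default_rng(3)

def nearest(codebook, x):
    """Index of the codevector nearest to x in squared Euclidean distance."""
    return int(np.argmin(np.sum((codebook - x) ** 2, axis=1)))

def two_stage_split_vq(x, stage1_codebook, split_codebooks):
    """Two-stage split VQ sketch: a full-dimensional first stage captures
    cross-dimension structure, then the residual is quantized split by
    split with small independent codebooks to keep complexity low."""
    i1 = nearest(stage1_codebook, x)
    residual = x - stage1_codebook[i1]
    indices, recon = [i1], stage1_codebook[i1].copy()
    start = 0
    for cb in split_codebooks:      # each codebook handles one sub-vector
        dim = cb.shape[1]
        j = nearest(cb, residual[start:start + dim])
        recon[start:start + dim] += cb[j]
        indices.append(j)
        start += dim
    return indices, recon

# Toy 4-dimensional example with random codebooks (illustrative sizes).
x = rng.normal(size=4)
stage1 = rng.normal(size=(8, 4))
splits = [rng.normal(scale=0.5, size=(4, 2)), rng.normal(scale=0.5, size=(4, 2))]
idx, x_hat = two_stage_split_vq(x, stage1, splits)
```

Searching one 8-entry full-dimensional codebook plus two 4-entry split codebooks is far cheaper than searching a single unstructured codebook of equivalent rate, which is the complexity trade the abstract describes.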

Relevance:

20.00%

Publisher:

Abstract:

It is maintained that the one-parameter scaling theory is inconsistent with the physics of Anderson localisation.