971 results for maximum pseudolikelihood (MPL) estimation


Relevance: 30.00%

Abstract:

We propose a linear regression method for estimating Weibull parameters from life tests. The method uses stochastic models of the unreliability at each failure instant; this leads to a heteroscedastic regression problem that is solved by weighted least squares minimization. The main feature of our method is an innovative s-normalization of the failure data models, which yields analytic expressions for the centers and weights of the regression. The method has been contrasted by Monte Carlo simulation with Benard's approximation and with Maximum Likelihood Estimation, and it achieves the highest global scores for robustness and performance.
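The Benard baseline mentioned in the abstract can be sketched as median-rank regression on complete failure data. This is a minimal illustration of that comparison method, not the authors' s-normalized weighted scheme; the function name is ours.

```python
import numpy as np

def weibull_median_rank_regression(times):
    """Estimate Weibull shape (beta) and scale (eta) from complete failure
    times via median-rank regression with Benard's approximation -- the
    baseline method the abstract compares against."""
    t = np.sort(np.asarray(times, dtype=float))
    n = len(t)
    ranks = np.arange(1, n + 1)
    F = (ranks - 0.3) / (n + 0.4)          # Benard's median-rank estimate
    # Linearized Weibull CDF: ln(-ln(1 - F)) = beta*ln(t) - beta*ln(eta)
    x = np.log(t)
    y = np.log(-np.log(1.0 - F))
    beta, intercept = np.polyfit(x, y, 1)
    eta = np.exp(-intercept / beta)
    return beta, eta
```

Unlike the proposed method, this baseline treats all regression points as equally reliable, which is exactly the heteroscedasticity issue the weighted scheme addresses.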

Relevance: 30.00%

Abstract:

In Operational Modal Analysis of structures we often have multiple time-history records of vibrations measured at different time instants. This work presents a procedure for estimating the modal parameters of the structure by processing all the records together, that is, using all available information to obtain a single estimate of the modal parameters. The method uses Maximum Likelihood Estimation and the Expectation Maximization algorithm. It has been applied to several problems involving both simulated and real structures, and the results show the advantage of the proposed joint analysis.
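As a toy illustration of the joint-estimation idea (a sketch of the general principle only, not the authors' modal-parameter method), the maximum likelihood estimate of a common mean from several records pools all samples, weighting each record by its length, rather than averaging per-record estimates:

```python
import numpy as np

def pooled_mle_mean(records):
    """ML estimate of a common mean from multiple records: pool every
    sample, so longer records carry proportionally more weight."""
    total = sum(np.sum(r) for r in records)
    n = sum(len(r) for r in records)
    return total / n
```

With records [1, 2, 3] and [10], the pooled estimate is 16/4 = 4.0, whereas averaging the two per-record means would give 6.0; the abstract's EM-based procedure performs the analogous pooling for modal parameters.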

Relevance: 30.00%

Abstract:

A multivariate analysis of flood variables is needed to design some hydraulic structures, such as dams, because the complexity of the routing process in a reservoir requires a representation of the full hydrograph. In this work, a bivariate copula model was used to obtain the joint distribution of flood peak and volume, in order to determine the probability of occurrence of a given inflow hydrograph. However, the risk of dam overtopping is given by the maximum water elevation reached during the routing process, which depends on the hydrograph variables, the reservoir volume, and the spillway crest length. Consequently, an additional bivariate return period, the so-called routed return period, was defined in terms of the risk of dam overtopping, based on the maximum water elevation obtained after routing the inflow hydrographs. The theoretical return periods, which give the probability of occurrence of a hydrograph prior to accounting for reservoir routing, were compared with the routed return period, as in both cases hydrographs with the same probability draw a curve in the peak-volume space. The procedure was applied to the case study of the Santillana reservoir in Spain. Different reservoir volumes and spillway lengths were considered to investigate the influence of the dam and reservoir characteristics on the results. The methodology improves the estimation of the Design Flood Hydrograph and can be applied to assess the risk of dam overtopping.
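The copula step can be sketched as follows. The abstract does not specify the copula family; the Gumbel family used here is a common choice for flood peak-volume dependence and is our assumption:

```python
import math

def gumbel_copula(u, v, theta):
    """Gumbel copula C(u, v) for theta >= 1 (theta = 1 is independence)."""
    return math.exp(-(((-math.log(u)) ** theta
                       + (-math.log(v)) ** theta) ** (1.0 / theta)))

def or_return_period(u, v, theta):
    """'OR' joint return period: average recurrence interval of events in
    which peak OR volume exceeds its marginal quantile, T = 1/(1 - C(u, v))."""
    return 1.0 / (1.0 - gumbel_copula(u, v, theta))
```

The routed return period described in the abstract additionally requires routing each hydrograph through the reservoir to obtain the maximum water elevation, which is beyond this sketch.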

Relevance: 30.00%

Abstract:

Computing the modal parameters of large structures in Operational Modal Analysis often requires processing data from multiple, non-simultaneously recorded setups of sensors. These setups share some sensors, the so-called reference sensors, which remain fixed for all measurements, while the other sensors are moved from one setup to the next. One possibility is to process the setups separately, which results in different modal parameter estimates for each setup; the reference sensors are then used to merge, or glue, the different parts of the mode shapes into global modes, while the natural frequencies and damping ratios are usually averaged. In this paper we present a state-space model that processes all setups at once, so that the global mode shapes are obtained automatically and a single value of the natural frequency and damping ratio is computed for each mode. We also show how this model can be estimated using maximum likelihood and the Expectation Maximization algorithm. The technique is applied to real data measured at a footbridge.
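The conventional gluing step that the paper's joint model replaces can be sketched as follows: each setup's partial mode shape is rescaled so that its reference-sensor entries best match those of the first setup, and the roving parts are then concatenated. The data layout below is our assumption.

```python
import numpy as np

def glue_mode_shapes(setups):
    """Merge partial mode shapes from several setups into one global shape.
    Each setup is a (ref, rov) pair: entries at the shared reference sensors
    and at that setup's roving sensors. The result is expressed in the
    scaling of the first setup: [ref0, rov0, scaled rov1, scaled rov2, ...].
    """
    ref0, rov0 = (np.asarray(p, dtype=float) for p in setups[0])
    parts = [ref0, rov0]
    for ref, rov in setups[1:]:
        ref = np.asarray(ref, dtype=float)
        rov = np.asarray(rov, dtype=float)
        # Least-squares scale a minimizing ||a*ref - ref0||
        a = (ref @ ref0) / (ref @ ref)
        parts.append(a * rov)
    return np.concatenate(parts)
```

Because each setup is estimated independently before gluing, the partial shapes carry independent errors; the paper's state-space model avoids this by estimating one global shape directly.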

Relevance: 30.00%

Abstract:

We propose a general procedure for solving incomplete-data estimation problems. The procedure can be used to find the maximum likelihood estimate or to solve estimating equations in difficult cases such as estimation with the censored or truncated regression model, the nonlinear structural measurement error model, and the random effects model. The procedure is based on the general principle of stochastic approximation and the Markov chain Monte Carlo method. Applying the theory of adaptive algorithms, we derive conditions under which the proposed procedure converges. Simulation studies also indicate that the procedure consistently converges to the maximum likelihood estimate for the structural measurement error logistic regression model.
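A minimal sketch of the stochastic-approximation idea on a toy incomplete-data problem (exponential lifetimes right-censored at known times; this specific model and all names are our choice, not the paper's): censored lifetimes are imputed by simulation, a running sufficient statistic is updated with decreasing step sizes, and the complete-data MLE is recomputed at each iteration.

```python
import numpy as np

def saem_censored_exponential(y, observed, n_iter=2000, seed=0):
    """Stochastic-approximation EM for exponential lifetimes, where y[i] is
    either a failure time (observed[i] True) or a right-censoring time.
    Converges to the closed-form MLE rate = (#failures) / sum(y)."""
    rng = np.random.default_rng(seed)
    y = np.asarray(y, dtype=float)
    observed = np.asarray(observed, dtype=bool)
    n = len(y)
    lam = 1.0          # current rate estimate
    S = y.sum()        # running approximation of E[sum of lifetimes]
    for k in range(1, n_iter + 1):
        # Simulated E-step: by memorylessness, the residual life beyond a
        # censoring time is again exponential with the current rate.
        t = y.copy()
        n_cens = int((~observed).sum())
        t[~observed] += rng.exponential(1.0 / lam, size=n_cens)
        # Stochastic-approximation update with step size 1/k
        S += (t.sum() - S) / k
        # M-step: complete-data MLE of the rate
        lam = n / S
    return lam
```

For this toy model the MLE is available in closed form, which makes the convergence of the stochastic procedure easy to check; the paper's contribution is precisely the cases where no such closed form exists.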

Relevance: 30.00%

Abstract:

Frequencies of meiotic configurations in cytogenetic stocks depend on chiasma frequencies in segments defined by centromeres, breakpoints, and telomeres. The Expectation Maximization algorithm is proposed as a general method for maximum likelihood estimation of the chiasma frequencies in the intervals between such locations. The estimates can be translated via mapping functions into genetic maps of cytogenetic landmarks. One set of observational data was analyzed to exemplify the application of these methods; the results were largely concordant with other comparable data. The method was also tested by Monte Carlo simulation of frequencies of meiotic configurations from a monotelodisomic translocation heterozygote, assuming six different sample sizes. The estimate averages were always close to the values initially given to the parameters. The maximum likelihood estimation procedures extend readily to other kinds of cytogenetic stocks and allow the pooling of diverse cytogenetic data to collectively estimate the lengths of segments, arms, and chromosomes.
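The "mapping function" translation step can be illustrated with Haldane's function, which converts a recombination fraction into an additive map distance assuming no chiasma interference (the abstract does not say which mapping function was used):

```python
import math

def haldane_map_distance(r):
    """Haldane mapping function: recombination fraction r (0 <= r < 0.5)
    to map distance in Morgans, d = -ln(1 - 2r) / 2."""
    if not 0.0 <= r < 0.5:
        raise ValueError("recombination fraction must lie in [0, 0.5)")
    return -0.5 * math.log(1.0 - 2.0 * r)
```

For small r the distance is approximately r itself; multiply by 100 to express it in centimorgans.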

Relevance: 30.00%

Abstract:

This paper deals with the estimation of a time-invariant channel spectrum from its own nonuniform samples, assuming there is a bound on the channel’s delay spread. Except for this last assumption, this is the basic estimation problem in systems providing channel spectral samples. However, as shown in the paper, the delay spread bound leads us to view the spectrum as a band-limited signal, rather than the Fourier transform of a tapped delay line (TDL). Using this alternative model, a linear estimator is presented that approximately minimizes the expected root-mean-square (RMS) error for a deterministic channel. Its main advantage over the TDL is that it takes into account the spectrum’s smoothness (time width), thus providing a performance improvement. The proposed estimator is compared numerically with the maximum likelihood (ML) estimator based on a TDL model in pilot-assisted channel estimation (PACE) for OFDM.
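The TDL baseline mentioned in the abstract can be sketched as a least-squares fit of complex tap gains to the nonuniform spectral samples (tap delays are assumed known here; this is the comparison model, not the proposed band-limited estimator):

```python
import numpy as np

def tdl_fit(freqs, H, delays):
    """Least-squares fit of a tapped-delay-line model
    H(f) = sum_l h_l * exp(-j*2*pi*f*tau_l) to spectral samples H taken at
    the (possibly nonuniform) frequencies freqs; returns the tap gains h."""
    A = np.exp(-2j * np.pi * np.outer(np.asarray(freqs, dtype=float), delays))
    h, *_ = np.linalg.lstsq(A, np.asarray(H), rcond=None)
    return h

def tdl_eval(freqs, h, delays):
    """Evaluate the fitted TDL spectrum at arbitrary frequencies."""
    A = np.exp(-2j * np.pi * np.outer(np.asarray(freqs, dtype=float), delays))
    return A @ h
```

The proposed estimator instead models the spectrum directly as a band-limited signal, exploiting its smoothness rather than a fixed set of delays.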

Relevance: 30.00%

Abstract:

We present biogenic opal flux records from two deep-sea sites in the Scotia Sea (MD07-3133 and MD07-3134) at decadal-scale resolution, covering the last glacial cycle. Besides conventional and time-consuming biogenic opal measuring methods, we introduce new biogenic opal estimation methods derived from sediment colour b*, wet bulk density, Si/Ti count ratio, and Fourier transform infrared spectroscopy (FTIRS). All methods capture the biogenic opal amplitude; however, FTIRS, a novel method for marine sediments, yields the most reliable results. 230Th normalization data show strong differences in sediment focusing, with intensified focusing during glacial times. At MD07-3134, 230Th-normalized biogenic opal fluxes vary between 0.2 and 2.5 g/cm2/kyr. Our biogenic opal flux records indicate bioproductivity changes in the Southern Ocean that are strongly influenced by sea ice distribution and by summer sea surface temperature changes. South of the Antarctic Polar Front, the lowest bioproductivity occurred during the Last Glacial Maximum, when upwelling of mid-depth water was reduced and sea ice cover intensified. Around 17 ka, bioproductivity increased abruptly, corresponding to rising atmospheric CO2 concentrations and decreasing seasonal sea ice coverage.

Relevance: 30.00%

Abstract:

Transportation Department, Office of Systems Engineering, Washington, D.C.

Relevance: 30.00%

Abstract:

Evidence indicates that cruciferous vegetables are protective against a range of cancers, with glucosinolates and their breakdown products considered the biologically active constituents. To date, epidemiological studies have not investigated intakes of these constituents owing to a lack of food composition databases. The aim of the present study was to develop a database of the glucosinolate content of cruciferous vegetables that can be used to quantify dietary exposure in epidemiological studies of diet-disease relationships. Published food composition data sources for the glucosinolate content of cruciferous vegetables were identified and assessed for data quality using established criteria. Adequate data for total glucosinolate content were available from eighteen published studies, providing 140 estimates for forty-two items. The highest glucosinolate values were for cress (389 mg/100 g), while the lowest were for Pe-tsai Chinese cabbage (20 mg/100 g). There is considerable variation in the values reported for the same vegetable by different studies, with a median difference between the minimum and maximum values of 5.8-fold. Limited analysis of cooked cruciferous vegetables has been conducted; however, the available data show that average losses during cooking are approximately 36%. This is the first attempt to collate the available literature on the glucosinolate content of cruciferous vegetables. These data will allow quantification of glucosinolate intakes, which can be used in epidemiological studies to investigate the role of cruciferous vegetables in cancer aetiology and prevention.
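The reported average cooking loss of roughly 36% implies a simple retention adjustment when estimating intake from raw-vegetable values (a rough sketch; actual losses vary by vegetable and cooking method):

```python
def cooked_content(raw_mg_per_100g, loss_fraction=0.36):
    """Apply the average cooking loss reported in the abstract (~36%) to a
    raw glucosinolate content in mg/100 g."""
    return raw_mg_per_100g * (1.0 - loss_fraction)
```

For cress at 389 mg/100 g, this gives roughly 249 mg/100 g after cooking.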