965 results for "A posteriori error estimation"


Relevance:

20.00%

Publisher:

Abstract:

Magdeburg, University, Faculty of Electrical Engineering and Information Technology, dissertation, 2010

Relevance:

20.00%

Publisher:

Abstract:

This paper deals with the estimation of milk production by means of weekly, biweekly, monthly and bimonthly observations, and also by the method known as 6-5-8, in which one observation is taken in the 6th week of lactation, another in the 5th month and a third in the 8th month. The data studied were obtained from 72 lactations of the Holstein-Friesian breed at the Escola Superior de Agricultura "Luiz de Queiroz" (Piracicaba, São Paulo, Brazil), with 6 calvings in each month of the year and also 12 first calvings, 12 second calvings, and so on, up to the sixth. The authors criticize the use of the "maximum error" found in papers dealing with this subject, and also the use of the mean deviation. The former is completely superseded and inadvisable; the latter, although equivalent to a certain extent to the usual standard deviation, has only 87.6% of its efficiency, according to KENDALL (9, pp. 130-131; 10, pp. 6-7).

The estimates obtained were compared with the actual production, obtained by daily control, and the deviations observed were studied. Their means and standard deviations are given in Table IV. In spite of BOX's recent results (11), which show that with equal numbers in all classes a certain inequality of variances is not important, the authors separated the methods before carrying out the analysis of variance, thus avoiding grouping methods with too different standard deviations. The first three methods were compared first (Table VI), then the analysis was carried out with the first four methods (Table VII), and finally the last two methods were compared (Table VIII). These analyses of variance compare the arithmetic means of the deviations produced by the methods studied, which is equivalent to comparing their biases. We conclude that season of calving and order of calving do not affect the biases, and that the methods themselves do not differ in this respect, with the exception of the 6-5-8 method.

Another method of attack, perhaps preferable, is to compare the estimates of the biases with their expected value under the null hypothesis (zero) by the t-test, t = (x̄ - 0)/s(x̄). We have:

1) Weekly control: t = (8.59 - 0)/s(x̄) = 1.56
2) Biweekly control: t = (11.20 - 0)/6.21 = 1.80
3) Monthly control: t = (7.17 - 0)/9.48 = 0.76
4) Bimonthly control: t = (-4.66 - 0)/17.56 = -0.26
5) Method 6-5-8: t = (144.89 - 0)/22.41 = 6.46***

The three asterisks denote significance at the 0.1% probability level. We thus conclude that the weekly, biweekly, monthly and bimonthly methods of control may be assumed to be unbiased. The 6-5-8 method is shown to be positively biased, the bias amounting to 5.9% of the mean milk production.

The precision of the methods studied may be judged by their standard deviations, or by intervals covering, with a certain probability (95%, for example), the deviation x corresponding to an estimate obtained by one of the methods studied. Since the difference x - x̄, where x̄ is the mean of the 72 deviations obtained for each method, has a t distribution with mean zero and estimated standard deviation s(x - x̄) = sqrt(1 + 1/72) · s = 1.007 s, and the critical value of t at the 5% probability level with 71 degrees of freedom is 1.99, the interval to be considered is given by x̄ ± 1.99 × 1.007 s = x̄ ± 2.00 s. The intervals thus calculated are given in Table IX.
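
As a quick check of the arithmetic above, the following Python sketch (not part of the original paper) recomputes the t statistics from the means and standard errors quoted in the abstract (the standard error for the weekly control is not given, so that method is omitted) and the ±2.00 s interval factor:

from math import sqrt
from scipy.stats import t as t_dist

# Mean deviation and standard error of the mean quoted in the abstract
# for each control method (weekly omitted: its standard error is not given).
methods = {
    "biweekly":  (11.20, 6.21),
    "monthly":   (7.17, 9.48),
    "bimonthly": (-4.66, 17.56),
    "6-5-8":     (144.89, 22.41),
}

n = 72          # lactations per method
df = n - 1      # degrees of freedom

for name, (mean_dev, se) in methods.items():
    t_stat = (mean_dev - 0.0) / se               # test of zero bias
    p_two_sided = 2 * t_dist.sf(abs(t_stat), df)
    print(f"{name:10s} t = {t_stat:6.2f}  p = {p_two_sided:.4f}")

# Half-width factor of the 95% interval for a single new deviation x:
# x_bar +/- t_crit * sqrt(1 + 1/n) * s, i.e. approximately x_bar +/- 2.00 s.
t_crit = t_dist.ppf(0.975, df)
print("interval factor:", round(t_crit * sqrt(1 + 1 / n), 2))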

Relevance:

20.00%

Publisher:

Abstract:

This work describes a test tool that makes it possible to run performance tests of different end-to-end available bandwidth estimation algorithms, along with their different implementations. The goal of such tests is to find the best-performing algorithm and implementation and to use it in the congestion control mechanism of high-performance reliable transport protocols. The main idea of this paper is to describe the options that provide an available bandwidth estimation mechanism for high-speed data transport protocols, and to develop the basic functionality of such a test tool, with which it is possible to manage the entities of the test application on all testing hosts involved, aided by some middleware.
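
As an illustration of how such a test tool could manage test-application entities on the hosts involved, the following Python sketch launches one estimation trial between two testing hosts over SSH. The host names and the "abw_probe" estimator binary are hypothetical placeholders standing in for whatever algorithm implementation is under test; the middleware layer described in the paper is not reproduced here.

import subprocess

# Hypothetical testing hosts; in a real deployment these would be the
# machines on which the test-application entities are started.
HOSTS = {"sender": "host-a.example.net", "receiver": "host-b.example.net"}

def start_receiver():
    # Launch the receiving entity in the background and keep a handle to it.
    return subprocess.Popen(["ssh", HOSTS["receiver"], "abw_probe --receive"])

def run_sender():
    # Run the sending entity and capture the reported bandwidth estimate.
    result = subprocess.run(
        ["ssh", HOSTS["sender"], f"abw_probe --send {HOSTS['receiver']}"],
        capture_output=True, text=True, check=True)
    return result.stdout

if __name__ == "__main__":
    receiver = start_receiver()
    try:
        print(run_sender())
    finally:
        receiver.terminate()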

Relevance:

20.00%

Publisher:

Abstract:

Magdeburg, University, Faculty of Electrical Engineering and Information Technology, dissertation, 2015

Relevance:

20.00%

Publisher:

Abstract:

Otto-von-Guericke-Universität Magdeburg, Faculty of Mathematics, dissertation, 2015

Relevance:

20.00%

Publisher:

Abstract:

Given a model that can be simulated, conditional moments at a trial parameter value can be calculated with high accuracy by applying kernel smoothing methods to a long simulation. With such conditional moments in hand, standard method of moments techniques can be used to estimate the parameter. Since conditional moments are calculated using kernel smoothing rather than simple averaging, it is not necessary that the model be simulable subject to the conditioning information that is used to define the moment conditions. For this reason, the proposed estimator is applicable to general dynamic latent variable models. Monte Carlo results show that the estimator performs well in comparison to other estimators that have been proposed for estimation of general DLV models.
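
To make the kernel-smoothing step concrete, here is a minimal Python sketch under illustrative assumptions: a simple AR(1) process stands in for "a model that can be simulated", and the conditional moment E[y_t | y_{t-1} = c] at a trial parameter value is estimated by Nadaraya-Watson (Gaussian-kernel) regression on a long simulated path. The model, bandwidth and sample size are placeholders, not those used in the paper.

import numpy as np

def simulate(theta, n, seed=0):
    # Illustrative simulable model: AR(1) with unit-variance innovations.
    rng = np.random.default_rng(seed)
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = theta * y[t - 1] + rng.standard_normal()
    return y

def conditional_moment(theta, cond_points, n=100_000, bandwidth=0.3):
    """E[y_t | y_{t-1} = c] by a Gaussian-kernel weighted average of a simulation."""
    y = simulate(theta, n)
    x, target = y[:-1], y[1:]
    out = np.empty(len(cond_points))
    for i, c in enumerate(cond_points):
        w = np.exp(-0.5 * ((x - c) / bandwidth) ** 2)
        out[i] = np.sum(w * target) / np.sum(w)
    return out

# Moment conditions would then match these simulated conditional moments
# against their sample counterparts at each trial value of theta.
print(conditional_moment(0.8, np.array([-1.0, 0.0, 1.0])))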

Relevance:

20.00%

Publisher:

Abstract:

According to the hypothesis of Traub, also known as the 'formula of Traub', postmortem values of glucose and lactate found in the cerebrospinal fluid or vitreous humor are considered indicators of antemortem blood glucose levels. However, because the lactate concentration increases in the vitreous and cerebrospinal fluid after death, some authors postulated that using the sum value to estimate antemortem blood glucose levels could lead to an overestimation of the cases of glucose metabolic disorders with fatal outcomes, such as diabetic ketoacidosis. The aim of our study, performed on 470 consecutive forensic cases, was to ascertain the advantages of the sum value to estimate antemortem blood glucose concentrations and, consequently, to rule out fatal diabetic ketoacidosis as the cause of death. Other biochemical parameters, such as blood 3-beta-hydroxybutyrate, acetoacetate, acetone, glycated haemoglobin and urine glucose levels, were also determined. In addition, postmortem native CT scan, autopsy, histology, neuropathology and toxicology were performed to confirm diabetic ketoacidosis as the cause of death. According to our results, the sum value does not add any further information for the estimation of antemortem blood glucose concentration. The vitreous glucose concentration appears to be the most reliable marker to estimate antemortem hyperglycaemia and, along with the determination of other biochemical markers (such as blood acetone and 3-beta-hydroxybutyrate, urine glucose and glycated haemoglobin), to confirm diabetic ketoacidosis as the cause of death.

Relevance:

20.00%

Publisher:

Abstract:

The dispersal process, by which individuals or other dispersing agents such as gametes or seeds move from birthplace to a new settlement locality, has important consequences for the dynamics of genes, individuals, and species. Many of the questions addressed by ecology and evolutionary biology require a good understanding of species' dispersal patterns. Much effort has thus been devoted to overcoming the difficulties associated with dispersal measurement. In this context, genetic tools have long been the focus of intensive research, providing a great variety of potential solutions to measuring dispersal. This methodological diversity is reviewed here to help (molecular) ecologists find their way toward dispersal inference and interpretation and to stimulate further developments.

Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND: Recommendations for statin use for primary prevention of coronary heart disease (CHD) are based on estimation of the 10-year CHD risk. We compared the 10-year CHD risk assessments and the eligibility percentages for statin therapy obtained with three scoring algorithms currently used in Europe. METHODS: We studied 5683 women and men, aged 35-75, without overt cardiovascular disease (CVD), in a population-based study in Switzerland. We compared the 10-year CHD risk using three scoring schemes: the Framingham risk score (FRS) from the U.S. National Cholesterol Education Program's Adult Treatment Panel III (ATP III), the PROCAM scoring scheme from the International Atherosclerosis Society (IAS), and the European risk SCORE for low-risk countries, without and with extrapolation to age 60 as recommended by the European Society of Cardiology guidelines (ESC). With FRS and PROCAM, high risk was defined as a 10-year risk of fatal or non-fatal CHD > 20%; with SCORE, as a 10-year risk of fatal CVD ≥ 5%. We compared the proportions of high-risk participants and eligibility for statin use according to these three schemes. For each guideline, we estimated the impact of increasing statin use from current partial compliance to full compliance on potential CHD deaths averted over 10 years, using a success proportion of 27% for statins. RESULTS: The proportion of participants classified as high risk (both sexes) was 5.8% with FRS and 3.0% with PROCAM, whereas the European risk SCORE classified 12.5% as high risk (15.4% with extrapolation to age 60). For the primary prevention of CHD, 18.5% of participants were eligible for statin therapy using ATP III, 16.6% using IAS, and 10.3% using ESC (13.0% with extrapolation), because the ESC guidelines recommend statin therapy only in high-risk subjects. In comparison with IAS, agreement in identifying adults eligible for statins was good with ATP III but only moderate with ESC. From a population perspective, full compliance with the ATP III guidelines would avert up to 17.9% of the 24,310 CHD deaths expected over 10 years in Switzerland, versus 17.3% with IAS and 10.8% with ESC (11.5% with extrapolation). CONCLUSIONS: Full compliance with guidelines for statin therapy would yield substantial health benefits, but the proportions of high-risk adults and of adults eligible for statin use varied substantially depending on the scoring system and corresponding guidelines used to estimate CHD risk in Europe.
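
To put the RESULTS in absolute terms (a back-of-the-envelope calculation from the figures quoted above, not reported separately in the abstract): 24,310 × 0.179 ≈ 4,350 CHD deaths potentially averted over 10 years under full ATP III compliance, 24,310 × 0.173 ≈ 4,210 under IAS, and 24,310 × 0.108 ≈ 2,630 under ESC (≈ 2,800 with extrapolation).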

Relevance:

20.00%

Publisher:

Abstract:

This paper does two things. First, it presents alternative approaches to the standard methods of estimating productive efficiency using a production function. It favours a parametric approach (viz. the stochastic production frontier approach) over a nonparametric approach (e.g. data envelopment analysis); and, further, one that provides a statistical explanation of efficiency, as well as an estimate of its magnitude. Second, it illustrates the favoured approach (i.e. the ‘single stage procedure’) with estimates of two models of explained inefficiency, using data from the Thai manufacturing sector, after the crisis of 1997. Technical efficiency is modelled as being dependent on capital investment in three major areas (viz. land, machinery and office appliances) where land is intended to proxy the effects of unproductive, speculative capital investment; and both machinery and office appliances are intended to proxy the effects of productive, non-speculative capital investment. The estimates from these models cast new light on the five-year long, post-1997 crisis period in Thailand, suggesting a structural shift from relatively labour intensive to relatively capital intensive production in manufactures from 1998 to 2002.
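
For readers unfamiliar with the single-stage procedure, a standard formulation (in the spirit of the Battese and Coelli model; the exact specification estimated in the paper may differ) writes the frontier and the explained-inefficiency equation jointly as

\ln y_i = f(x_i; \beta) + v_i - u_i, \qquad v_i \sim N(0, \sigma_v^2), \qquad u_i \sim N^{+}(z_i' \delta, \sigma_u^2),

where y_i is output, x_i the inputs, and z_i the capital-investment covariates (land, machinery, office appliances) that shift the mean of the one-sided inefficiency term u_i. All parameters are estimated in a single maximum-likelihood step, so the statistical explanation of inefficiency is obtained together with, rather than after, the estimate of its magnitude.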

Relevance:

20.00%

Publisher:

Abstract:

This paper contributes to the on-going empirical debate regarding the role of the RBC model and in particular of technology shocks in explaining aggregate fluctuations. To this end we estimate the model’s posterior density using Markov-Chain Monte-Carlo (MCMC) methods. Within this framework we extend Ireland’s (2001, 2004) hybrid estimation approach to allow for a vector autoregressive moving average (VARMA) process to describe the movements and co-movements of the model’s errors not explained by the basic RBC model. The results of marginal likelihood ratio tests reveal that the more general model of the errors significantly improves the model’s fit relative to the VAR and AR alternatives. Moreover, despite setting the RBC model a more difficult task under the VARMA specification, our analysis, based on forecast error and spectral decompositions, suggests that the RBC model is still capable of explaining a significant fraction of the observed variation in macroeconomic aggregates in the post-war U.S. economy.
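
The posterior simulation itself can be illustrated with a short random-walk Metropolis sketch in Python. The log_posterior argument is a placeholder: in the application described above it would combine the prior with the likelihood of the state-space form of the RBC model with VARMA measurement errors (e.g. evaluated by the Kalman filter), none of which is shown here.

import numpy as np

def metropolis(log_posterior, theta0, n_draws=10_000, step=0.05, seed=0):
    # Random-walk Metropolis sampler for an arbitrary log posterior density.
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    logp = log_posterior(theta)
    draws = np.empty((n_draws, theta.size))
    for i in range(n_draws):
        proposal = theta + step * rng.standard_normal(theta.size)
        logp_prop = log_posterior(proposal)
        if np.log(rng.uniform()) < logp_prop - logp:   # accept / reject
            theta, logp = proposal, logp_prop
        draws[i] = theta
    return draws

# Toy usage with a standard-normal "posterior", just to show the interface.
samples = metropolis(lambda th: -0.5 * np.sum(th ** 2), theta0=[0.0, 0.0])
print(samples.mean(axis=0), samples.std(axis=0))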

Relevance:

20.00%

Publisher:

Abstract:

This paper develops methods for Stochastic Search Variable Selection (currently popular with regression and Vector Autoregressive models) for Vector Error Correction models where there are many possible restrictions on the cointegration space. We show how this allows the researcher to begin with a single unrestricted model and either do model selection or model averaging in an automatic and computationally efficient manner. We apply our methods to a large UK macroeconomic model.
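
The core of Stochastic Search Variable Selection is a spike-and-slab prior on each coefficient, with a binary inclusion indicator updated inside the Gibbs sampler. The Python sketch below shows that single updating step for one coefficient under illustrative prior settings; the paper's treatment of restrictions on the cointegration space of the VECM is not reproduced here.

import numpy as np

def update_indicator(beta_j, tau0=0.01, tau1=10.0, prior_incl=0.5, rng=None):
    # Given the current draw of coefficient beta_j, draw its inclusion
    # indicator from the relative plausibility of the "spike" prior
    # (standard deviation tau0, effectively excluding the variable) versus
    # the "slab" prior (standard deviation tau1, keeping it in the model).
    rng = rng or np.random.default_rng()
    def normal_pdf(x, sd):
        return np.exp(-0.5 * (x / sd) ** 2) / (sd * np.sqrt(2 * np.pi))
    slab = prior_incl * normal_pdf(beta_j, tau1)
    spike = (1 - prior_incl) * normal_pdf(beta_j, tau0)
    p_include = slab / (slab + spike)
    return rng.uniform() < p_include, p_include

# A coefficient near zero is almost surely assigned to the spike (excluded),
# while a clearly nonzero one is almost surely assigned to the slab (included).
print(update_indicator(0.005))
print(update_indicator(1.2))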