Abstract:
This Bachelor's thesis deals with the estimation of the parameters of fluorescence lifetime functions using the EM algorithm. The algorithm is applied to both simulated and measured data. The parameters are first estimated globally for the entire sample using a simplex method; the ratio of the fluorescence lifetime components, i.e. the probability that a photon originates from a given component, is then determined for each pixel of an image by the EM algorithm. The measurements are available as counts of detected photons in discrete time intervals, but the information about how many of the photons in an interval belong to each component is missing. By using conditional expectations, the EM algorithm is able to handle these unobserved data without bias. The estimation is further complicated by the fact that the data arise from the convolution of the fluorescence lifetime function with a so-called instrument response function, which makes the model very complex. A solution to this problem is also presented in the course of the thesis.
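The per-pixel EM iteration described above can be sketched for the simplest case: two decay components with known, globally fitted lifetimes, discrete time bins, and no instrument-response convolution. All names, the bin grid, and the initial guess are illustrative:

```python
import math

def em_mixture_fraction(counts, bins, tau1, tau2, n_iter=200):
    """Estimate the probability p that a photon stems from component 1 of a
    two-exponential fluorescence decay, given photon counts per time bin.
    tau1 and tau2 are the (globally fitted) lifetimes; the instrument
    response function is ignored here for simplicity."""
    def density(tau):
        # Discretized, bin-normalized exponential decay
        w = [math.exp(-t / tau) for t in bins]
        s = sum(w)
        return [x / s for x in w]
    f1, f2 = density(tau1), density(tau2)
    p = 0.5  # initial guess
    n = sum(counts)
    for _ in range(n_iter):
        # E-step: conditional expectation of the fraction of photons in
        # each bin that belong to component 1
        resp = [p * a / (p * a + (1 - p) * b) for a, b in zip(f1, f2)]
        # M-step: update p as the count-weighted mean responsibility
        p = sum(r * c for r, c in zip(resp, counts)) / n
    return p
```

The conditional expectation in the E-step is exactly what lets the algorithm cope with not knowing how many photons per bin belong to each component.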
Abstract:
Magdeburg, Univ., Faculty of Electrical Engineering and Information Technology, Diss., 2014
Abstract:
Magdeburg, Univ., Faculty of Medicine, Diss., 2014
Abstract:
This study aims to determine the age and estimate the growth parameters of the species using scales. Individuals of Piaractus mesopotamicus (Holmberg, 1887) used in this study were captured in the commercial fishery conducted in the region throughout 2006. The model selected to express the growth of the species was the von Bertalanffy equation, Sl = Sl∞·[1 − exp(−k(t − t0))]. To determine whether scales are suitable for studying the growth of pacu, we analyzed the relation between standard length (Sl) and the radius of the scales through linear regression. The period of annuli formation was determined by analyzing the variations in the marginal increment and by evaluating the consistency of the readings through the coefficient of variation (CV) for the average standard lengths of each age (number of rings) observed in the scales. The relationship between the Sl of the fish and the radius of the scales showed that scales can be used to study the age and growth of P. mesopotamicus (R = 0.79). CVs were always below 20%, demonstrating the consistency of the readings. Annuli formation occurred in February, probably related to the trophic migration that occurs in this month in the region. The growth equations obtained for P. mesopotamicus are Sl = 50.00·[1 − exp(−0.18(t − (−3.00)))] for males and Sl = 59.23·[1 − exp(−0.14(t − (−3.36)))] for females. The growth parameters obtained in this study were lower than in other studies previously conducted for the same species, which may be related to the overexploitation of the species by fishing in the region. These values also show that female pacu attain a greater asymptotic length than males, which grow faster.
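The von Bertalanffy growth curve used above is a one-line function. A minimal sketch, using the female parameter estimates reported in the abstract (Sl∞ = 59.23, k = 0.14, t0 = −3.36) purely as example inputs:

```python
import math

def von_bertalanffy(t, sl_inf, k, t0):
    """Standard length at age t under the von Bertalanffy growth model:
    Sl(t) = sl_inf * (1 - exp(-k * (t - t0)))."""
    return sl_inf * (1.0 - math.exp(-k * (t - t0)))

# Example: predicted standard length of a 10-year-old female
length_at_10 = von_bertalanffy(10.0, 59.23, 0.14, -3.36)
```

By construction the curve is zero at t = t0 and rises monotonically toward the asymptotic length sl_inf.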
Abstract:
The objective of this paper is to analyse to what extent the use of cross-section data distorts the estimated elasticities for car ownership demand when the observed variables do not correspond to a state of equilibrium for some individuals in the sample. Our proposal consists of approximating the equilibrium values of the observed variables by constructing a pseudo-panel data set, which entails averaging individuals observed at different points in time into cohorts. The results show that individual and aggregate data lead to almost the same value for income elasticity, whereas with respect to working-adult elasticity the similarity is less pronounced.
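The cohort-averaging step behind a pseudo-panel can be sketched as follows; the field names ('cohort', 'period', 'income') are illustrative, not taken from the paper:

```python
from collections import defaultdict

def pseudo_panel(records):
    """Average individual observations into (cohort, period) cells.
    records: list of dicts, each with 'cohort', 'period' and numeric variables."""
    cells = defaultdict(list)
    for r in records:
        cells[(r['cohort'], r['period'])].append(r)
    panel = {}
    for key, rows in cells.items():
        variables = [k for k in rows[0] if k not in ('cohort', 'period')]
        # cell value = mean over all individuals falling in that cell
        panel[key] = {v: sum(row[v] for row in rows) / len(rows) for v in variables}
    return panel
```

Each cell of the resulting pseudo-panel then plays the role of one "individual" tracked over time, approximating equilibrium values by averaging away individual disequilibrium.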
Abstract:
Although polychlorinated biphenyls (PCBs) have been banned in many countries for more than three decades, exposures to PCBs continue to be of concern due to their long half-lives and carcinogenic effects. In National Institute for Occupational Safety and Health studies, we are using semiquantitative plant-specific job exposure matrices (JEMs) to estimate historical PCB exposures for workers (n = 24,865) exposed to PCBs from 1938 to 1978 at three capacitor manufacturing plants. A subcohort of these workers (n = 410) employed in two of these plants had serum PCB concentrations measured up to four times between 1976 and 1989. Our objectives were to evaluate the strength of association between an individual worker's measured serum PCB levels and the same worker's cumulative exposure estimated through 1977 with (1) the JEM and (2) duration of employment, and to calculate the variance in serum PCB levels explained by the JEM using (3) simple linear regression. Consistent, strong, and statistically significant associations were observed between the cumulative exposures estimated with the JEM and serum PCB concentrations for all years. The strength of association between duration of employment and serum PCBs was good for highly chlorinated (Aroclor 1254/HPCB) but not for less chlorinated (Aroclor 1242/LPCB) PCBs. In the simple regression models, cumulative occupational exposure estimated using the JEMs explained 14-24% of the variance of the Aroclor 1242/LPCB serum concentrations and 22-39% of the variance of the Aroclor 1254/HPCB serum concentrations. We regard the cumulative exposure estimated with the JEM as a better estimate of PCB body burden than serum concentrations quantified as Aroclor 1242/LPCB and Aroclor 1254/HPCB.
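Cumulative exposure from a JEM amounts to summing intensity × duration over a work history. A minimal sketch; the plant/department/era keys and intensity values below are hypothetical, not taken from the study:

```python
def cumulative_exposure(work_history, jem):
    """Cumulative exposure = sum over jobs of JEM intensity x years worked.
    work_history: list of dicts with 'plant', 'dept', 'era', 'years'.
    jem: dict mapping (plant, dept, era) to a semiquantitative intensity."""
    return sum(jem[(job['plant'], job['dept'], job['era'])] * job['years']
               for job in work_history)

# Hypothetical example: 2 years in assembly, 3 years in maintenance
jem = {('plant_a', 'assembly', '1940s'): 3.0,
       ('plant_a', 'maintenance', '1950s'): 1.5}
history = [{'plant': 'plant_a', 'dept': 'assembly', 'era': '1940s', 'years': 2.0},
           {'plant': 'plant_a', 'dept': 'maintenance', 'era': '1950s', 'years': 3.0}]
total = cumulative_exposure(history, jem)  # 3.0*2 + 1.5*3 = 10.5
```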
Abstract:
This paper contributes to the on-going empirical debate regarding the role of the RBC model and in particular of technology shocks in explaining aggregate fluctuations. To this end we estimate the model’s posterior density using Markov-Chain Monte-Carlo (MCMC) methods. Within this framework we extend Ireland’s (2001, 2004) hybrid estimation approach to allow for a vector autoregressive moving average (VARMA) process to describe the movements and co-movements of the model’s errors not explained by the basic RBC model. The results of marginal likelihood ratio tests reveal that the more general model of the errors significantly improves the model’s fit relative to the VAR and AR alternatives. Moreover, despite setting the RBC model a more difficult task under the VARMA specification, our analysis, based on forecast error and spectral decompositions, suggests that the RBC model is still capable of explaining a significant fraction of the observed variation in macroeconomic aggregates in the post-war U.S. economy.
Abstract:
This paper investigates underlying changes in the UK economy over the past thirty-five years using a small open economy DSGE model. Using Bayesian analysis, we find that UK monetary policy, nominal price rigidity and exogenous shocks are all subject to regime shifts. A model incorporating these changes is used to estimate the realised monetary policy and to derive the optimal monetary policy for the UK. This allows us to assess the effectiveness of the realised policy in terms of stabilising economic fluctuations and, in turn, to provide an indication of whether there is room for monetary authorities to further improve their policies.
Abstract:
In this paper we develop methods for estimation and forecasting in large time-varying parameter vector autoregressive models (TVP-VARs). To overcome computational constraints with likelihood-based estimation of large systems, we rely on Kalman filter estimation with forgetting factors. We also draw on ideas from the dynamic model averaging literature and extend the TVP-VAR so that its dimension can change over time. A final extension lies in the development of a new method for estimating, in a time-varying manner, the parameter(s) of the shrinkage priors commonly used with large VARs. These extensions are operationalized through the use of forgetting factor methods and are, thus, computationally simple. An empirical application involving forecasting inflation, real output, and interest rates demonstrates the feasibility and usefulness of our approach.
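The forgetting-factor idea can be sketched in the simplest single-equation case: a Kalman filter for a time-varying parameter regression where, instead of specifying a state-noise covariance, the predicted covariance is inflated by 1/λ each period. The prior scale, λ value, and fixed measurement variance below are illustrative choices, not those of the paper:

```python
import numpy as np

def tvp_forgetting_filter(y, X, lam=0.99, sigma2=1.0):
    """Kalman filter with forgetting factor lam for y_t = x_t' beta_t + e_t.
    The forgetting step P_pred = P / lam replaces an explicit state-noise
    covariance, which is what keeps the recursion computationally simple."""
    T, k = X.shape
    beta = np.zeros(k)
    P = np.eye(k) * 10.0              # loose prior on the initial state
    betas = np.zeros((T, k))
    for t in range(T):
        x = X[t]
        P_pred = P / lam              # forgetting: inflate uncertainty
        S = x @ P_pred @ x + sigma2   # innovation variance
        K = P_pred @ x / S            # Kalman gain
        beta = beta + K * (y[t] - x @ beta)
        P = P_pred - np.outer(K, x @ P_pred)
        betas[t] = beta
    return betas
```

Values of λ slightly below one discount old observations geometrically, so the coefficients can drift; λ = 1 recovers a constant-parameter recursive least squares.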
Abstract:
Employing an endogenous growth model with human capital, this paper explores how productivity shocks in the goods and human capital producing sectors contribute to explaining aggregate fluctuations in output, consumption, investment and hours. Given the importance of accounting for both the dynamics and the trends in the data not captured by the theoretical growth model, we introduce a vector error correction model (VECM) of the measurement errors and estimate the model’s posterior density function using Bayesian methods. To contextualize our findings with those in the literature, we also assess whether the endogenous growth model or the standard real business cycle model better explains the observed variation in these aggregates. In addressing these issues we contribute to both the methods of analysis and the ongoing debate regarding the effects of innovations to productivity on macroeconomic activity.
Abstract:
This paper investigates the usefulness of switching Gaussian state space models as a tool for implementing dynamic model selection (DMS) or dynamic model averaging (DMA) in time-varying parameter regression models. DMS methods allow for model switching, where a different model can be chosen at each point in time; thus, they allow the explanatory variables in the time-varying parameter regression model to change over time. DMA instead carries out model averaging in a time-varying manner. We compare our exact approach to DMA/DMS with a popular existing procedure that relies on forgetting factor approximations. In an inflation forecasting application, we use DMS to select different predictors over time. We also compare different ways of implementing DMA/DMS and investigate whether they lead to similar results.
Abstract:
An expanding literature articulates the view that Taylor rules are helpful in predicting exchange rates. In a changing world however, Taylor rule parameters may be subject to structural instabilities, for example during the Global Financial Crisis. This paper forecasts exchange rates using such Taylor rules with Time Varying Parameters (TVP) estimated by Bayesian methods. In core out-of-sample results, we improve upon a random walk benchmark for at least half, and for as many as eight out of ten, of the currencies considered. This contrasts with a constant parameter Taylor rule model that yields a more limited improvement upon the benchmark. In further results, Purchasing Power Parity and Uncovered Interest Rate Parity TVP models beat a random walk benchmark, implying our methods have some generality in exchange rate prediction.
Abstract:
In this paper, we forecast EU-area inflation with many predictors using time-varying parameter models. The facts that time-varying parameter models are parameter-rich and the time span of our data is relatively short motivate a desire for shrinkage. In constant coefficient regression models, the Bayesian Lasso is gaining increasing popularity as an effective tool for achieving such shrinkage. In this paper, we develop econometric methods for using the Bayesian Lasso with time-varying parameter models. Our approach allows for the coefficient on each predictor to be: i) time varying, ii) constant over time or iii) shrunk to zero. The econometric methodology decides automatically which category each coefficient belongs in. Our empirical results indicate the benefits of such an approach.
Abstract:
This paper discusses the use of probabilistic or randomized algorithms for solving combinatorial optimization problems. Our approach employs non-uniform probability distributions to add a biased random behavior to classical heuristics, so that a large set of alternative good solutions can be obtained quickly, in a natural way, and without complex configuration processes. This procedure is especially useful in problems where properties such as non-smoothness or non-convexity lead to a highly irregular solution space, for which traditional optimization methods, both exact and approximate, may fail to reach their full potential. The results obtained are promising enough to suggest that randomizing classical heuristics is a powerful method that can be successfully applied in a variety of cases.
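Biased randomization of a classical greedy heuristic can be sketched with a geometric distribution over a cost-sorted candidate list: the cheapest candidate is the most likely, but not certain, choice at each step. The beta parameter and the toy cost list are illustrative:

```python
import random

def biased_randomized_greedy(costs, beta=0.3, seed=None):
    """Build an ordering of candidates by repeatedly drawing a position
    from a (truncated) geometric distribution over the cost-sorted list:
    index 0 with prob. beta, index 1 with beta*(1-beta), and so on.
    beta -> 1 recovers the deterministic greedy heuristic."""
    rng = random.Random(seed)
    candidates = sorted(costs)        # classical greedy order
    solution = []
    while candidates:
        idx = 0
        while rng.random() > beta and idx < len(candidates) - 1:
            idx += 1                  # skip to the next-best candidate
        solution.append(candidates.pop(idx))
    return solution
```

Running the function many times with different seeds produces a diverse set of near-greedy solutions, with beta as the only tuning parameter.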
Abstract:
The financial impact of the first outbreak of Trypanosoma vivax in the Brazilian Pantanal wetland is estimated. Results are extended to include outbreaks in the Bolivian lowlands, providing a notion of the potential influence of the disease and an analytical basis. More than 11 million head of cattle, valued at more than US$3 billion, are found in the Brazilian Pantanal and Bolivian lowlands. The total estimated cost of the 1995 outbreak of T. vivax is the sum of the present values of mortality, abortion, and productivity losses and treatment costs, or about 4% of total brood cow value on affected ranches. Had the outbreak gone untreated, the estimated losses would have exceeded 17% of total brood cow value.
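The "sum of present values" in the cost estimate is standard discounting of a stream of annual losses back to the outbreak year. A minimal sketch with illustrative cash flows and discount rate:

```python
def present_value(cashflows, rate):
    """Discount a list of annual losses back to year 0.
    cashflows[t] is the loss incurred in year t (year 0 is undiscounted)."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

# Hypothetical example: 100 lost in each of years 0 and 1, at a 10% rate
total_cost = present_value([100.0, 100.0], 0.10)
```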