974 results for Estimate model


Relevance: 30.00%

Abstract:

A sound statistical methodology is presented for modelling the correspondence between the characteristics of individuals, their thermal environment, and their thermal sensation. The proposed methodology substantially improves on that developed by P.O. Fanger by formulating a more general and more precise model of thermal comfort. It enables the model to be estimated from a sample of data in which all the comfort parameters vary simultaneously, which is not possible with Fanger's approach. Moreover, the present model remains valid when thermal conditions are far from optimal. (C) 1997 Elsevier Science Ltd.

Relevance: 30.00%

Abstract:

Time-lapse geophysical data acquired during transient hydrological experiments are increasingly employed to estimate subsurface hydraulic properties at the field scale. In particular, crosshole ground-penetrating radar (GPR) data, collected while water infiltrates into the subsurface by natural or artificial means, have been shown in a number of studies to contain valuable information concerning the hydraulic properties of the unsaturated zone. Previous work in this domain has considered a variety of infiltration conditions and different amounts of time-lapse GPR data in the estimation procedure. However, the particular benefits and drawbacks of these different strategies, as well as the impact of a variety of key and common assumptions, remain unclear. Using a Bayesian Markov chain Monte Carlo stochastic inversion methodology, we examine in this paper the information content of time-lapse zero-offset-profile (ZOP) GPR traveltime data, collected under three different infiltration conditions, for the estimation of van Genuchten-Mualem (VGM) parameters in a layered subsurface medium. Specifically, we systematically analyze synthetic and field GPR data acquired under natural loading and two rates of forced infiltration, and we consider the value of incorporating different amounts of time-lapse measurements into the estimation procedure. Our results confirm that, for all infiltration scenarios considered, the ZOP GPR traveltime data contain important information about subsurface hydraulic properties as a function of depth, with forced infiltration offering the greatest potential for VGM parameter refinement because of the stronger stressing of the hydrological system. Considering greater amounts of time-lapse data in the inversion procedure also helps refine VGM parameter estimates. Quite importantly, however, inconsistencies observed in the field results point to the strong possibility that the posterior uncertainties are influenced by model structural errors, which in turn underlines the fundamental importance of a systematic analysis of such errors in future related studies.
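As a toy illustration of the stochastic inversion idea described above (not the actual GPR physics or the VGM parameterization), the sketch below runs a Metropolis sampler over a single hypothetical parameter of a stand-in forward model; all names, constants and the forward relation are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy forward model standing in for a GPR traveltime simulator:
# traveltime ~ depth / velocity(theta), where theta is a hypothetical
# stand-in for a hydraulic parameter controlling water content.
def forward(theta, depths):
    return depths / (50.0 + 100.0 * theta)

depths = np.linspace(1.0, 10.0, 10)
theta_true = 0.3
sigma = 0.002                               # traveltime noise level
data = forward(theta_true, depths) + rng.normal(0.0, sigma, depths.size)

def log_posterior(theta):
    if not 0.0 < theta < 1.0:               # uniform prior on (0, 1)
        return -np.inf
    resid = data - forward(theta, depths)
    return -0.5 * np.sum(resid**2) / sigma**2

# Metropolis sampler: propose a random step, accept with the usual ratio
theta, logp = 0.5, log_posterior(0.5)
chain = []
for _ in range(5000):
    prop = theta + rng.normal(0.0, 0.02)
    logp_prop = log_posterior(prop)
    if np.log(rng.uniform()) < logp_prop - logp:
        theta, logp = prop, logp_prop
    chain.append(theta)

posterior = np.array(chain[1000:])          # discard burn-in
```

The retained samples approximate the posterior of the parameter given the traveltime-like data; the real study's layered VGM inversion works on the same principle with a physical forward solver.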

Relevance: 30.00%

Abstract:

We use panel data from the U.S. Health and Retirement Study, 1992-2002, to estimate the effect of self-assessed health limitations on the active labor market participation of older men. Self-assessments of health are likely to be endogenous to labor supply due to justification bias and individual-specific heterogeneity in subjective evaluations. We address both concerns. We propose a semiparametric binary choice procedure that incorporates nonadditive correlated individual-specific effects. Our estimation strategy identifies and estimates the average partial effects of health and functioning on labor market participation. The results indicate that poor health plays a major role in labor market exit decisions.

Relevance: 30.00%

Abstract:

Human arteries affected by atherosclerosis are characterized by altered wall viscoelastic properties. The possibility of noninvasively assessing arterial viscoelasticity in vivo would significantly contribute to the early diagnosis and prevention of this disease. This paper presents a noniterative technique to estimate the viscoelastic parameters of a vascular wall Zener model. The approach requires the simultaneous measurement of flow variations and wall displacements, which can be provided by suitable ultrasound Doppler instruments. Viscoelastic parameters are estimated by fitting the theoretical constitutive equations to the experimental measurements using an ARMA parameter approach. The accuracy and sensitivity of the proposed method are tested on reference data generated by numerical simulations of arterial pulsation, in which the physiological conditions and the viscoelastic parameters of the model can be suitably varied. The estimated values quantitatively agree with the reference values, showing that the only parameter affected by changes in the physiological conditions is viscosity, whose relative error is about 27% even when a poor signal-to-noise ratio is simulated. Finally, the feasibility of the method is illustrated through three measurements made at different flow regimes on a cylindrical vessel phantom, yielding a mean parameter estimation error of 25%.
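The ARMA-style fitting step can be sketched as follows. After discretization, a Zener-type constitutive relation links the output to its previous value and to the current and previous input, so the coefficients can be recovered by ordinary least squares; the coefficients a1, b0, b1 below are hypothetical stand-ins, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# Discrete ARMA(1,1)-type relation standing in for the discretized Zener
# constitutive equation: output s[n] (stress-like) depends on its previous
# value and on the current and previous input e[n] (strain/flow-like).
a1, b0, b1 = 0.9, 0.5, -0.3          # hypothetical coefficients
e = rng.normal(size=500)             # broadband input excitation
s = np.zeros_like(e)
for n in range(1, e.size):
    s[n] = a1 * s[n - 1] + b0 * e[n] + b1 * e[n - 1]

# Least-squares ARMA fit: regress s[n] on (s[n-1], e[n], e[n-1])
X = np.column_stack([s[:-1], e[1:], e[:-1]])
y = s[1:]
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
```

With noise-free data the fit recovers the coefficients exactly; in the paper's setting these coefficients would then be mapped back to the viscoelastic parameters of the Zener model.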

Relevance: 30.00%

Abstract:

We reformulate the Smets-Wouters (2007) framework by embedding the theory of unemployment proposed in Galí (2011a,b). We estimate the resulting model using postwar U.S. data, while treating the unemployment rate as an additional observable variable. Our approach overcomes the lack of identification of wage markup and labor supply shocks highlighted by Chari, Kehoe and McGrattan (2008) in their criticism of New Keynesian models, and allows us to estimate a "correct" measure of the output gap. In addition, the estimated model can be used to analyze the sources of unemployment fluctuations.

Relevance: 30.00%

Abstract:

An equation is applied to calculate the expected persistence time of an unstructured population of the white-toothed shrew Crocidura russula from Preverenges, a suburban area in western Switzerland. Population abundance data from March and November between 1977 and 1988 were fitted to the logistic density dependence model to estimate mean population growth rate as a function of population density. The variance in mean growth rate was approximated with two different models. The largest estimated persistence time was less than a few decades; the smallest, less than 10 years. The results are sensitive to the magnitude of the variance in population growth rate. Deviations from the logistic density dependence model in November are quite well explained by weather variables, but those in March are not. Variability in population growth rates measured in winter months may be better explained by behavioural mechanisms. Environmental variability, dispersal of juveniles and refugia within the range of the population may contribute to its long-term survival.
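A minimal sketch of the logistic density-dependence fit described above: under the logistic model the per-capita growth rate declines linearly with density, so a simple regression of growth rate on abundance yields the growth rate r and carrying capacity K. The abundances and parameter values below are hypothetical and noise-free:

```python
import numpy as np

# Logistic density dependence:
#   log(N[t+1] / N[t]) = r * (1 - N[t] / K)
# Regressing the growth rate on abundance gives r as the intercept and
# K as the zero-crossing of the fitted line.
r_true, K_true = 0.8, 120.0
N = np.array([20.0, 45.0, 70.0, 95.0, 110.0, 130.0])   # hypothetical censuses
g = r_true * (1.0 - N / K_true)                        # noise-free growth rates

A = np.column_stack([np.ones_like(N), N])
(intercept, slope), *_ = np.linalg.lstsq(A, g, rcond=None)
r_hat, K_hat = intercept, -intercept / slope
```

In the study, the residual scatter around this fitted line (here zero by construction) is what drives the variance in growth rate, and hence the estimated persistence time.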

Relevance: 30.00%

Abstract:

Despite the advancement of phylogenetic methods to estimate speciation and extinction rates, their power can be limited under variable rates, in particular for clades with high extinction rates and a small number of extant species. Fossil data can provide a powerful alternative source of information to investigate diversification processes. Here, we present PyRate, a computer program to estimate speciation and extinction rates and their temporal dynamics from fossil occurrence data. The rates are inferred in a Bayesian framework and are comparable to those estimated from phylogenetic trees. We describe how PyRate can be used to explore different models of diversification. In addition to the diversification rates, it provides estimates of the parameters of the preservation process (fossilization and sampling) and of the times of speciation and extinction of each species in the data set. Moreover, we develop a new birth-death model to correlate the variation of speciation/extinction rates with changes in a continuous trait. Finally, we demonstrate the use of Bayes factors for model selection and show how the posterior estimates of a PyRate analysis can be used to generate calibration densities for Bayesian molecular clock analysis. PyRate is an open-source command-line Python program available at http://sourceforge.net/projects/pyrate/.

Relevance: 30.00%

Abstract:

Given $n$ independent replicates of a jointly distributed pair $(X,Y)\in {\cal R}^d \times {\cal R}$, we wish to select from a fixed sequence of model classes ${\cal F}_1, {\cal F}_2, \ldots$ a deterministic prediction rule $f: {\cal R}^d \to {\cal R}$ whose risk is small. We investigate the possibility of empirically assessing the {\em complexity} of each model class, that is, the actual difficulty of the estimation problem within each class. The estimated complexities are in turn used to define an adaptive model selection procedure, which is based on complexity penalized empirical risk. The available data are divided into two parts. The first is used to form an empirical cover of each model class, and the second is used to select a candidate rule from each cover based on empirical risk. The covering radii are determined empirically to optimize a tight upper bound on the estimation error. An estimate is chosen from the list of candidates in order to minimize the sum of class complexity and empirical risk. A distinguishing feature of the approach is that the complexity of each model class is assessed empirically, based on the size of its empirical cover. Finite sample performance bounds are established for the estimates, and these bounds are applied to several non-parametric estimation problems. The estimates are shown to achieve a favorable tradeoff between approximation and estimation error, and to perform as well as if the distribution-dependent complexities of the model classes were known beforehand. In addition, it is shown that the estimate can be consistent, and even possess near optimal rates of convergence, when each model class has an infinite VC or pseudo dimension. For regression estimation with squared loss we modify our estimate to achieve a faster rate of convergence.
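A much-simplified sketch of the two-part split and complexity-penalized selection: it replaces the paper's empirical-cover construction with a plain parameter-count penalty, and the model classes (polynomials of increasing degree), data, and constants are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic regression data (illustrative)
x = rng.uniform(-1, 1, 200)
y = np.sin(np.pi * x) + rng.normal(0, 0.2, x.size)

# Two-part split: first half builds one candidate rule per model class,
# second half scores each candidate by empirical risk plus a penalty.
x1, y1, x2, y2 = x[:100], y[:100], x[100:], y[100:]

best_deg, best_score = None, np.inf
for deg in range(7):                                   # classes F_0 .. F_6
    coef = np.polyfit(x1, y1, deg)                     # candidate rule
    risk = np.mean((np.polyval(coef, x2) - y2) ** 2)   # empirical risk
    penalty = (deg + 1) / x2.size                      # crude complexity term
    score = risk + penalty
    if score < best_score:
        best_deg, best_score = deg, score
```

The selected degree balances approximation error against class complexity; the paper's contribution is to measure that complexity from the data itself, via the size of each class's empirical cover, rather than from a fixed parameter count as done here.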

Relevance: 30.00%

Abstract:

We estimate an open economy dynamic stochastic general equilibrium (DSGE) model of Australia with a number of shocks, frictions and rigidities, matching a large number of observable time series. We find that both foreign and domestic shocks are important drivers of the Australian business cycle. We also find that the initial impact on inflation of an increase in demand for Australian commodities is negative, due to an improvement in the real exchange rate, though there is a persistent positive effect on inflation that dominates at longer horizons.

Relevance: 30.00%

Abstract:

We study model selection strategies based on penalized empirical loss minimization. We point out a tight relationship between error estimation and data-based complexity penalization: any good error estimate may be converted into a data-based penalty function and the performance of the estimate is governed by the quality of the error estimate. We consider several penalty functions, involving error estimates on independent test data, empirical {\sc vc} dimension, empirical {\sc vc} entropy, and margin-based quantities. We also consider the maximal difference between the error on the first half of the training data and the second half, and the expected maximal discrepancy, a closely related capacity estimate that can be calculated by Monte Carlo integration. Maximal discrepancy penalty functions are appealing for pattern classification problems, since their computation is equivalent to empirical risk minimization over the training data with some labels flipped.
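The label-flipping equivalence can be checked directly on a tiny class of one-dimensional threshold classifiers (an illustrative sketch with synthetic data, not the paper's experiments):

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(3)

# Class of classifiers: h(x) = 1[x > t] and their complements.
# Maximizing the gap between the errors on the two halves of the sample
# is equivalent to empirical risk minimization after flipping the labels
# of the first half.
x = rng.uniform(0, 1, 20)
y = rng.integers(0, 2, 20)
m = 10                                   # equal half-sample size
y_flip = y.copy()
y_flip[:m] = 1 - y_flip[:m]              # flip first-half labels

def err(pred, labels):
    return np.mean(pred != labels)

best_disc, best_flip = -np.inf, np.inf
for t, sign in product(np.sort(x), [0, 1]):
    pred = (x > t).astype(int) ^ sign
    best_disc = max(best_disc, err(pred[:m], y[:m]) - err(pred[m:], y[m:]))
    best_flip = min(best_flip, err(pred, y_flip))

# For equal halves: max discrepancy = 1 - 2 * (min error on flipped sample)
```

The final identity holds because the error on the flipped sample equals one half plus half the (negated) discrepancy, so the same classifier extremizes both quantities.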

Relevance: 30.00%

Abstract:

The aim of this paper is twofold. First, we study the determinants of economic growth among a wide set of potential variables for the Spanish provinces (NUTS3). Among others, we include various types of private, public and human capital in the group of growth factors. Also, we analyse whether Spanish provinces have converged in economic terms in recent decades. The second objective is to obtain cross-section and panel data parameter estimates that are robust to model specification. For this purpose, we use a Bayesian Model Averaging (BMA) approach. Bayesian methodology constructs parameter estimates as a weighted average of linear regression estimates for every possible combination of included variables. The weight of each regression estimate is given by the posterior probability of each model.
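The averaging step can be sketched as follows: enumerate every subset of candidate regressors, fit each by OLS, weight each model by an approximation to its posterior probability, and average the coefficient estimates. This sketch uses synthetic data and a BIC-based weight; the paper's exact prior and posterior setup may differ:

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(4)

# Synthetic data: two of four candidate regressors truly matter
n, k = 100, 4
X = rng.normal(size=(n, k))
beta_true = np.array([1.5, 0.0, -2.0, 0.0])
y = X @ beta_true + rng.normal(0, 1, n)

models, bics, betas = [], [], []
for r in range(k + 1):
    for subset in combinations(range(k), r):           # every combination
        Xs = np.column_stack([np.ones(n)] + [X[:, j] for j in subset])
        coef, *_ = np.linalg.lstsq(Xs, y, rcond=None)
        rss = np.sum((y - Xs @ coef) ** 2)
        bic = n * np.log(rss / n) + Xs.shape[1] * np.log(n)
        full = np.zeros(k)                             # excluded coefs = 0
        for pos, j in enumerate(subset):
            full[j] = coef[pos + 1]
        models.append(subset)
        bics.append(bic)
        betas.append(full)

bics = np.array(bics)
w = np.exp(-0.5 * (bics - bics.min()))
w /= w.sum()                                # approximate posterior model probs
beta_bma = w @ np.array(betas)              # model-averaged coefficients
```

Coefficients of regressors with little support are shrunk toward zero, since most posterior weight falls on models that exclude them.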


Relevance: 30.00%

Abstract:

The Proctor test is time-consuming and requires sampling of several kilograms of soil. Proctor test parameters were predicted in Mollisols, Entisols and Vertisols of the Pampean region of Argentina under different management systems. They were estimated from a minimum number of readily available soil properties (soil texture, total organic C) and management (training data set; n = 73). The results were used to generate a soil compaction susceptibility model, which was subsequently validated using a second group of independent data (test data set; n = 24). Soil maximum bulk density was estimated as follows: Maximum bulk density (Mg m-3) = 1.4756 - 0.00599 total organic C (g kg-1) + 0.0000275 sand (g kg-1) + 0.0539 management. Management was equal to 0 for uncropped and untilled soils and 1 for conventionally tilled soils. The established models predicted the Proctor test parameters reasonably well, based on readily available soil properties. Tillage systems induced changes in the maximum bulk density regardless of total organic matter content or soil texture. The lower maximum apparent bulk density values under no-tillage require a revision of the relative compaction thresholds for different no-tillage crops.
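The fitted maximum-bulk-density equation can be wrapped directly in code; the coefficients are those reported above, while the example inputs are hypothetical:

```python
# Maximum bulk density from the fitted regression reported above.
# Units: total organic C in g kg-1, sand in g kg-1; management is 0 for
# uncropped and untilled soils, 1 for conventionally tilled soils.
def max_bulk_density(total_organic_c, sand, management):
    return (1.4756
            - 0.00599 * total_organic_c
            + 0.0000275 * sand
            + 0.0539 * management)

# Hypothetical example: tilled soil, 20 g kg-1 organic C, 500 g kg-1 sand
mbd = max_bulk_density(20.0, 500.0, 1)
```

As the signs of the coefficients show, more organic carbon lowers the maximum bulk density while conventional tillage raises it, consistent with the text's conclusion that tillage shifts compaction susceptibility independently of texture and organic matter.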

Relevance: 30.00%

Abstract:

In studies of the natural history of HIV-1 infection, the time scale of primary interest is the time since infection. Unfortunately, this time is very often unknown for HIV infection, and using the follow-up time instead of the time since infection is likely to provide biased results because of onset confounding. Laboratory markers such as the CD4 T-cell count carry important information concerning disease progression and can be used to predict the unknown date of infection. Previous work on this topic has made use of only one CD4 measurement or based the imputation on incident patients only. However, because of considerable intrinsic variability in CD4 levels, and because incident cases differ from prevalent cases, back-calculation based on only one CD4 determination per person, or on characteristics of the incident sub-cohort, may provide unreliable results. Therefore, we propose a methodology based on repeated individual CD4 T-cell marker measurements that uses both incident and prevalent cases to impute the unknown date of infection. Our approach jointly models the time since infection, the CD4 time path and the drop-out process. This methodology has been applied to estimate the CD4 slope and impute the unknown date of infection in HIV patients from the Swiss HIV Cohort Study. A procedure based on the comparison of different slope estimates is proposed to assess the goodness of fit of the imputation. Results of simulation studies indicated that the imputation procedure worked well, despite the intrinsic high volatility of the CD4 marker.

Relevance: 30.00%

Abstract:

Macroporosity is often used in the determination of soil compaction. Reduced macroporosity can lead to poor drainage, low root aeration and soil degradation. The aim of this study was to develop and test different models to estimate macro- and microporosity efficiently, using multiple regression. Ten soils were selected within a large range of textures: sand (Sa) 0.07-0.84, silt 0.03-0.24 and clay 0.13-0.78 kg kg-1, and subjected to three compaction levels (three bulk densities, BD). Two models with similar accuracy were selected, with a mean error of about 0.02 m³ m-3 (2 %). The model y = a + b.BD + c.Sa, named model 2, was selected for its simplicity to estimate macroporosity (Ma), microporosity (Mi) or total porosity (TP): Ma = 0.693 - 0.465 BD + 0.212 Sa; Mi = 0.337 + 0.120 BD - 0.294 Sa; TP = 1.030 - 0.345 BD - 0.082 Sa; porosity values are expressed in m³ m-3, BD in kg dm-3 and Sa in kg kg-1. The model was tested against a set of 76 data points from several other authors; an error of about 0.04 m³ m-3 (4 %) was observed. Simulations of variations in BD as a function of Sa are presented for Ma = 0 and Ma = 0.10 (10 %). The macroporosity equation was remodeled to obtain other compaction indexes: a) to simulate maximum bulk density (MBD) as a function of Sa (Equation 11), in agreement with literature data; b) to simulate relative bulk density (RBD) as a function of BD and Sa (Equation 13); c) another model to simulate RBD as a function of Ma and Sa (Equation 16), confirming the independence of this variable in relation to Sa for a fixed value of macroporosity and also supporting the hypothesis of Hakansson & Lipiec that RBD = 0.87 corresponds approximately to 10 % macroporosity (Ma = 0.10 m³ m-3).
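Model 2 can be written down directly; the sign of the Sa term in TP is taken as negative so that the fitted coefficients satisfy TP = Ma + Mi term by term:

```python
# Model 2, y = a + b*BD + c*Sa, with BD in kg dm-3 and Sa in kg kg-1;
# porosities returned in m3 m-3. Coefficients are those reported above.
def macroporosity(bd, sa):
    return 0.693 - 0.465 * bd + 0.212 * sa

def microporosity(bd, sa):
    return 0.337 + 0.120 * bd - 0.294 * sa

def total_porosity(bd, sa):
    return 1.030 - 0.345 * bd - 0.082 * sa
```

For example, a hypothetical soil with BD = 1.4 kg dm-3 and Sa = 0.5 kg kg-1 gives Ma of about 0.148 m3 m-3, just above the 0.10 macroporosity threshold discussed in the text.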