28 results for Internal working models

at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain


Relevance:

40.00%

Publisher:

Abstract:

Objectives: The purpose of this study was to determine possible differences in deflection between two needles of the same length and external gauge but different internal gauges during truncal block of the inferior alveolar nerve. The initial working hypothesis was that greater deflection could be expected with larger internal-gauge needles. Study design: Four clinicians performed inferior alveolar nerve block and infiltrating anesthesia of the buccal nerve trajectory in 346 patients for the surgical or conventional extraction of the lower third molar. A non-autoaspirating syringe system with two types of needle was used: a standard 27-gauge x 35-mm needle with an internal gauge of 0.215 mm, or an XL Monoprotect® 27-gauge x 35-mm needle with an internal gauge of 0.265 mm. The following information was systematically recorded for each patient: needle type, gender, anesthetic technique (direct or indirect truncal block), the number of bone contacts during the procedure, the extraction side, the practitioner performing the technique, and blood aspiration (positive or negative). Results: A total of 346 needles were used: 190 standard needles (27-gauge x 35-mm, internal gauge 0.215 mm) and 156 XL Monoprotect® needles. Deflection was observed in 49.1% of cases (170 needles): 94 standard needles and 76 XL Monoprotect® needles. Needle torsion ranged from 0º to 6º. Conclusions: No significant differences in deflection were recorded with respect to internal gauge, operator, extraction side, anesthetic technique, or the number of bone contacts during the procedure.

Relevance:

30.00%

Publisher:

Abstract:

This comment corrects the errors in the estimation process that appear in Martins (2001). The first error is in the parametric probit estimation, as the previously presented results do not maximize the log-likelihood function. At the global maximum, more variables become significant. As for the semiparametric estimation method, the kernel function used in Martins (2001) can take on both positive and negative values, which implies that the participation-probability estimates may fall outside the interval [0,1]. We solve this problem by applying local smoothing in the kernel estimation, as suggested by Klein and Spady (1993).
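
As a minimal illustration of the [0,1] issue, the sketch below (hypothetical data, not from Martins, 2001) shows a Nadaraya-Watson estimate of the participation probability built from a nonnegative Gaussian kernel, which is a convex combination of 0/1 outcomes and therefore always stays inside [0,1]; a kernel that takes negative values offers no such guarantee.

```python
import numpy as np

def nw_probability(x_grid, x_obs, y_obs, bandwidth):
    """Nadaraya-Watson estimate of P(y = 1 | x) with a Gaussian kernel.

    Because the Gaussian kernel is nonnegative, each estimate is a convex
    combination of the 0/1 outcomes and therefore lies in [0, 1]; a
    higher-order kernel with negative values need not have this property.
    """
    x_grid = np.asarray(x_grid)[:, None]          # shape (G, 1)
    u = (x_grid - x_obs[None, :]) / bandwidth     # shape (G, n)
    w = np.exp(-0.5 * u ** 2)                     # Gaussian kernel weights
    return (w * y_obs[None, :]).sum(axis=1) / w.sum(axis=1)

# Hypothetical example: binary participation indicator against a scalar index.
rng = np.random.default_rng(0)
x = rng.normal(size=500)
y = (x + rng.normal(size=500) > 0).astype(float)  # "participation" outcomes
p_hat = nw_probability(np.linspace(-2, 2, 9), x, y, bandwidth=0.3)
print(p_hat)  # every value falls inside [0, 1]
```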

Relevance:

30.00%

Publisher:

Abstract:

This paper provides empirical evidence that continuous-time models with a single volatility factor are, under some conditions, able to fit the main characteristics of financial data. It also documents the importance of the feedback factor in capturing the strong volatility clustering in the data, caused by a possible change in the pattern of volatility in the last part of the sample. We use the Efficient Method of Moments (EMM) of Gallant and Tauchen (1996) to estimate logarithmic models with one and two stochastic volatility factors (with and without feedback) and to select among them.
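
For concreteness, a minimal simulation sketch of a discretized one-factor logarithmic stochastic volatility model is given below; the phi * |r_t| term is only a stylized stand-in for the feedback channel mentioned above, since the abstract does not spell out the exact specification.

```python
import numpy as np

def simulate_log_sv(n, a=0.0, b=0.95, s_v=0.25, phi=0.0, seed=0):
    """Simulate a discretized one-factor logarithmic SV model.

    r_t     = exp(v_t / 2) * u_t,                          u_t ~ N(0, 1)
    v_{t+1} = a + b * v_t + phi * |r_t| + s_v * e_t,       e_t ~ N(0, 1)

    The phi * |r_t| term is a stylized stand-in for a "feedback" channel;
    the specification used in the paper may differ.
    """
    rng = np.random.default_rng(seed)
    v = np.zeros(n)
    r = np.zeros(n)
    for t in range(n - 1):
        r[t] = np.exp(v[t] / 2) * rng.normal()
        v[t + 1] = a + b * v[t] + phi * abs(r[t]) + s_v * rng.normal()
    r[-1] = np.exp(v[-1] / 2) * rng.normal()
    return r, v

returns, log_var = simulate_log_sv(2000, phi=0.1)
# Volatility clustering shows up as positive autocorrelation in |returns|.
print(returns.std(), np.corrcoef(np.abs(returns[:-1]), np.abs(returns[1:]))[0, 1])
```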

Relevance:

30.00%

Publisher:

Abstract:

We analyse the effects of investment decisions and firms' internal organisation on the efficiency and stability of horizontal mergers. In our framework, economies of scale are endogenous and there may be internal conflict within merged firms. We show that stable mergers often do not lead to more efficiency and may even lead to efficiency losses. Such mergers lower total welfare, suggesting that a regulator should be careful in assuming that the potential efficiency gains of a merger will be effectively realised. Moreover, the paper offers a possible explanation for merger failures.

Relevance:

30.00%

Publisher:

Abstract:

When habits are introduced multiplicatively in a capital accumulation model, the consumers' objective function might fail to be concave. In this paper we provide conditions aimed at guaranteeing the existence of interior solutions to the consumers' problem. We also characterize the equilibrium path of two growth models with multiplicative habits: the internal habit formation model, where individual habits coincide with own past consumption, and the external habit formation (or catching-up with the Joneses) model, where habits arise from the average past consumption in the economy. We show that the introduction of external habits makes the equilibrium path inefficient during the transition towards the balanced growth path. We characterize in this context the optimal tax policy.
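
A commonly used multiplicative-habit specification, shown here only to illustrate why concavity can fail (the exact functional form in the paper may differ), is:

```latex
% Illustrative multiplicative-habit utility (functional form assumed):
\[
  u(c_t, h_t) \;=\; \frac{\bigl(c_t\, h_t^{-\gamma}\bigr)^{1-\sigma}}{1-\sigma},
  \qquad \gamma > 0,\ \sigma > 0,
\]
% with internal habits h_t = c_{t-1} (own past consumption) and external habits
% h_t = \bar{C}_{t-1} (average past consumption in the economy). For \gamma > 0
% the composite c_t h_t^{-\gamma} need not make the objective concave in
% (c_t, h_t), which is why additional conditions are needed for interior solutions.
```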

Relevance:

30.00%

Publisher:

Abstract:

Transcript of the talk given by Mr. Gabriel Colomé in the University Course on Olympism organised by the Centre d'Estudis Olímpics (CEO-UAB) in February 1992. With this text the author pursues two main objectives: on the one hand, to analyse the influence of the socio-political environment on the organisational structure of the Games' Organising Committee; on the other, to examine how the type of financing affects the structure and infrastructure of the Games themselves, and what differences exist between the 1972 Games and subsequent editions up to Barcelona.

Relevance:

30.00%

Publisher:

Abstract:

Given a model that can be simulated, conditional moments at a trial parameter value can be calculated with high accuracy by applying kernel smoothing methods to a long simulation. With such conditional moments in hand, standard method of moments techniques can be used to estimate the parameter. Since conditional moments are calculated using kernel smoothing rather than simple averaging, it is not necessary that the model be simulable subject to the conditioning information that is used to define the moment conditions. For this reason, the proposed estimator is applicable to general dynamic latent variable models. Monte Carlo results show that the estimator performs well in comparison to other estimators that have been proposed for estimation of general DLV models.
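
A minimal sketch of the estimation idea follows: simulate a long path at a trial parameter value, estimate the conditional moment by kernel regression on the conditioning variable, and minimize a GMM-type distance. The toy model, the single moment condition, and all tuning constants are illustrative assumptions, not the paper's.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def simulate_ar1_sv(theta, n, seed=0):
    """Toy latent-variable model: AR(1) log-volatility driving observed y."""
    rng = np.random.default_rng(seed)
    h = np.zeros(n)
    for t in range(1, n):
        h[t] = theta * h[t - 1] + 0.3 * rng.normal()
    return np.exp(h / 2) * rng.normal(size=n)

def kernel_conditional_moment(x_cond, x_sim, g_sim, bandwidth=0.2):
    """Nadaraya-Watson estimate of E[g | x] at the observed conditioning points."""
    u = (x_cond[:, None] - x_sim[None, :]) / bandwidth
    w = np.exp(-0.5 * u ** 2)
    return (w * g_sim[None, :]).sum(axis=1) / w.sum(axis=1)

def objective(theta, y_obs):
    """Distance between observed squared y and its kernel-smoothed simulated
    conditional expectation given lagged y (an illustrative moment condition).
    The fixed seed keeps the same draws across trial values of theta."""
    y_sim = simulate_ar1_sv(theta, n=5000)            # long simulation (kept modest)
    m_hat = kernel_conditional_moment(y_obs[:-1], y_sim[:-1], y_sim[1:] ** 2)
    e = y_obs[1:] ** 2 - m_hat                        # moment residuals
    return float(np.mean(e) ** 2)                     # scalar GMM-type criterion

y_obs = simulate_ar1_sv(0.9, n=1000, seed=42)         # pretend "data"
res = minimize_scalar(objective, bounds=(0.0, 0.99), args=(y_obs,), method="bounded")
print(res.x)
```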

Relevance:

30.00%

Publisher:

Abstract:

There is recent interest in generalizing classical factor models, in which the idiosyncratic factors are assumed to be orthogonal and identification restrictions are imposed on the cross-sectional and time dimensions. In this study, we describe and implement a Bayesian approach to generalized factor models. A flexible framework is developed to determine the variation attributed to common and idiosyncratic factors. We also propose a methodology to select the (generalized) factor model that best fits a given set of data. Applying the proposed methodology to simulated data and to foreign exchange rate data, we provide a comparative analysis of the classical and generalized factor models. We find that, when moving from the classical to the generalized model, there are significant changes in the estimates of the covariance and correlation structures, while the changes in the estimates of the factor loadings and the variation attributed to common factors are less dramatic.
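
In illustrative notation (assumed here, not taken from the paper), the contrast between the classical and generalized models can be written as:

```latex
% Classical factor model for an N-vector y_t with k common factors f_t
% (notation assumed):
\[
  y_t \;=\; \Lambda f_t + \varepsilon_t,
  \qquad \operatorname{Cov}(\varepsilon_t) = \Psi \ \text{diagonal},
\]
% i.e. the idiosyncratic factors are orthogonal. In the generalized model the
% orthogonality restriction is relaxed, so \Psi may contain nonzero off-diagonal
% elements, and the identification restrictions on the cross-sectional and time
% dimensions are weakened accordingly.
```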

Relevance:

30.00%

Publisher:

Abstract:

Given a model that can be simulated, conditional moments at a trial parameter value can be calculated with high accuracy by applying kernel smoothing methods to a long simulation. With such conditional moments in hand, standard method of moments techniques can be used to estimate the parameter. Because conditional moments are calculated using kernel smoothing rather than simple averaging, it is not necessary that the model be simulable subject to the conditioning information that is used to define the moment conditions. For this reason, the proposed estimator is applicable to general dynamic latent variable models. It is shown that as the number of simulations diverges, the estimator is consistent and a higher-order expansion reveals the stochastic difference between the infeasible GMM estimator based on the same moment conditions and the simulated version. In particular, we show how to adjust standard errors to account for the simulations. Monte Carlo results show how the estimator may be applied to a range of dynamic latent variable (DLV) models, and that it performs well in comparison to several other estimators that have been proposed for DLV models.
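
Under illustrative notation (assumed here, not quoted from the paper), the kernel-smoothed conditional moment at the core of the estimator can be written as a Nadaraya-Watson regression over the simulated path:

```latex
% Kernel-smoothed conditional moment over a simulated path
% {(x_s(theta), y_s(theta))}_{s=1,...,S} with kernel K_h (notation assumed):
\[
  \widehat{E}\bigl[g(y,\theta)\mid x\bigr]
  \;=\;
  \frac{\sum_{s=1}^{S} K_h\bigl(x_s(\theta) - x\bigr)\, g\bigl(y_s(\theta),\theta\bigr)}
       {\sum_{s=1}^{S} K_h\bigl(x_s(\theta) - x\bigr)},
\]
% evaluated at the observed conditioning points x; the fitted conditional moments
% then enter a standard GMM criterion. As S grows, the simulation noise vanishes,
% which is the source of the consistency and standard-error-adjustment results
% described in the abstract.
```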

Relevance:

30.00%

Publisher:

Abstract:

This paper proposes a contemporaneous-threshold multivariate smooth transition autoregressive (C-MSTAR) model in which the regime weights depend on the ex ante probabilities that latent regime-specific variables exceed certain threshold values. A key feature of the model is that the transition function depends on all the parameters of the model as well as on the data. Since the mixing weights are also a function of the regime-specific innovation covariance matrix, the model can account for contemporaneous regime-specific co-movements of the variables. The stability and distributional properties of the proposed model are discussed, as well as issues of estimation, testing and forecasting. The practical usefulness of the C-MSTAR model is illustrated by examining the relationship between US stock prices and interest rates.
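
A stylized two-regime version of the idea, written in assumed notation that may differ from the paper's exact specification, is:

```latex
% Stylized two-regime contemporaneous-threshold mixture (assumed notation):
\[
  y_t \;=\; G_t\bigl(\Phi_1 y_{t-1} + \varepsilon_{1t}\bigr)
        + (1 - G_t)\bigl(\Phi_2 y_{t-1} + \varepsilon_{2t}\bigr),
  \qquad \varepsilon_{jt} \sim N(0, \Sigma_j),
\]
\[
  G_t \;=\; \Pr\bigl(y^{*}_{1t} \ge c \mid \mathcal{I}_{t-1}\bigr),
  \qquad y^{*}_{1t} = \Phi_1 y_{t-1} + \varepsilon_{1t},
\]
% so the mixing weight G_t depends on the regime-specific parameters \Phi_1, the
% threshold c, the innovation covariance \Sigma_1 and the lagged data, in line
% with the abstract's description of the transition function.
```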

Relevance:

30.00%

Publisher:

Abstract:

In this paper we consider extensions of smooth transition autoregressive (STAR) models to situations where the threshold is a time-varying function of variables that affect the separation of regimes of the time series under consideration. Our specification is motivated by the observation that unusually high/low values for an economic variable may sometimes be best thought of in relative terms. State-dependent logistic STAR and contemporaneous-threshold STAR models are introduced and discussed. These models are also used to investigate the dynamics of U.S. short-term interest rates, where the threshold is allowed to be a function of past output growth and inflation.
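
As an illustration (with hypothetical coefficients, not the paper's), a state-dependent logistic STAR transition with a time-varying threshold can be written as:

```latex
% Illustrative state-dependent logistic transition (hypothetical coefficients):
\[
  y_t = \bigl(1 - G(s_t)\bigr)\,\phi_1' x_t + G(s_t)\,\phi_2' x_t + \varepsilon_t,
  \qquad
  G(s_t) = \frac{1}{1 + \exp\{-\gamma\,(s_t - c_t)\}},
\]
\[
  c_t = \alpha_0 + \alpha_1\,\Delta y^{\text{gdp}}_{t-1} + \alpha_2\,\pi_{t-1},
\]
% so the threshold c_t separating the regimes moves with lagged output growth and
% inflation, making "unusually high" or "unusually low" values of s_t a relative
% notion.
```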

Relevance:

30.00%

Publisher:

Abstract:

Ghosh's model is discussed in this paper under two alternative scenarios. In an open version we compare it with Leontief's model and prove that they reduce to each other under some specific productive conditions. We then move on to reconsider the alleged implausibility of Ghosh's model, reformulating the model to incorporate a closure rule. The closure solves, to some extent, the implausibility problem very clearly set out by Oosterhaven, since value added is then correctly computed and responsive to allocation changes resulting from supply shocks.
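
For reference, the two open models in standard input-output notation (A = input coefficient matrix, B = allocation coefficient matrix, f = final demand, v = value added) are:

```latex
\[
  \text{Leontief (demand-driven):}\quad x = A x + f
  \;\Longrightarrow\; x = (I - A)^{-1} f,
\]
\[
  \text{Ghosh (supply-driven):}\quad x' = x' B + v'
  \;\Longrightarrow\; x' = v' (I - B)^{-1}.
\]
% The closure rule discussed in the paper makes part of the exogenous account
% endogenous, so that value added is correctly computed and responds to
% allocation changes after a supply shock instead of being held fixed.
```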

Relevance:

30.00%

Publisher:

Abstract:

Joint-stability in interindustry models relates to the mutual simultaneous consistency of the demand-driven and supply-driven models of Leontief and Ghosh, respectively. Previous work has claimed joint-stability to be an acceptable assumption from the empirical viewpoint, provided only small changes in exogenous variables are considered. We show in this note, however, that the issue has deeper theoretical roots and offer an analytical demonstration that shows the impossibility of consistency between demand-driven and supply-driven models.

Relevance:

30.00%

Publisher:

Abstract:

Report for the scientific sojourn carried out at the University of California at Berkeley, from September to December 2007. Environmental niche modelling (ENM) techniques are powerful tools for predicting species' potential distributions. In the last ten years, a plethora of novel methodological approaches and modelling techniques have been developed. For three months I stayed at the University of California, Berkeley, working under the supervision of Dr. David R. Vieites. The aim of our work was to quantify the error made by these techniques and to test how an increase in sample size affects the resulting predictions. Using the MaxEnt software, we generated predictive distribution maps of the Eurasian quail (Coturnix coturnix) in the Iberian Peninsula from different sample sizes. The quail is a generalist species from a climatic point of view, but a habitat specialist. The resulting distribution maps were compared with the real distribution of the species, obtained from recent bird atlases of Spain and Portugal. The results show that ENM techniques can make important errors when predicting the distribution of generalist species. Moreover, an increase in sample size is not necessarily associated with better model performance. We conclude that a deep knowledge of the species' biology and of the variables affecting its distribution is crucial for optimal modelling; the lack of such knowledge can lead to wrong conclusions.
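
As a minimal sketch of the kind of comparison described above, the code below scores a thresholded predicted presence map against an atlas-based observed map with a confusion matrix; the arrays, grid, and 0.5 threshold are hypothetical, and MaxEnt's continuous suitability output would have to be thresholded before such a comparison.

```python
import numpy as np

def confusion_metrics(predicted, observed):
    """Compare a binary predicted presence map with an atlas-based observed map.

    Both inputs are boolean arrays over the same grid cells (hypothetical data).
    """
    tp = np.sum(predicted & observed)        # correctly predicted presences
    tn = np.sum(~predicted & ~observed)      # correctly predicted absences
    fp = np.sum(predicted & ~observed)       # commission errors
    fn = np.sum(~predicted & observed)       # omission errors
    sensitivity = tp / (tp + fn) if (tp + fn) else float("nan")
    specificity = tn / (tn + fp) if (tn + fp) else float("nan")
    return {"sensitivity": sensitivity,
            "specificity": specificity,
            "TSS": sensitivity + specificity - 1.0}  # true skill statistic

# Hypothetical example: threshold a continuous suitability surface at 0.5.
rng = np.random.default_rng(1)
suitability = rng.random((100, 100))          # stand-in for a MaxEnt output grid
atlas = rng.random((100, 100)) > 0.6          # stand-in for atlas presence cells
print(confusion_metrics(suitability > 0.5, atlas))
```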