112 results for empirical predictors
Abstract:
Recent theoretical models of economic growth have emphasised the role of external effects in the accumulation of factors of production. Although most of the literature has considered externalities across firms within a region, in this paper we go a step further and consider the possibility that these externalities cross the borders of regional economies. We assess the role of these external effects in explaining growth and economic convergence. We present a simple growth model that includes externalities across economies and develop a methodology for testing their existence and estimating their strength. In our view, spatial econometrics is naturally suited to an empirical analysis of these externalities. We find evidence of significant externalities across both Spanish and European regions.
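The abstract does not spell out the estimating equation, but the standard spatial-econometric device for cross-region externalities is a growth regression that includes a spatially lagged term built from a weight matrix of neighbouring regions. A minimal sketch of that idea, assuming hypothetical region-level variables (growth, initial_income, investment) and a toy neighbour matrix; none of the names or data come from the paper:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 50                                   # hypothetical number of regions

# Toy region-level data (placeholders, not the paper's dataset)
initial_income = rng.normal(10, 1, n)
investment = rng.normal(0.2, 0.05, n)
growth = 0.05 - 0.01 * initial_income + 0.1 * investment + rng.normal(0, 0.01, n)

# Row-standardised spatial weight matrix: here a simple ring of neighbours
W = np.zeros((n, n))
for i in range(n):
    W[i, (i - 1) % n] = W[i, (i + 1) % n] = 1.0
W /= W.sum(axis=1, keepdims=True)

# Spatially lagged growth of neighbouring regions proxies cross-region externalities
spatial_lag = W @ growth

X = sm.add_constant(np.column_stack([initial_income, investment, spatial_lag]))
ols = sm.OLS(growth, X).fit()
print(ols.params)   # coefficient on the last column gauges the externality's strength
```

OLS with a spatially lagged dependent variable is only illustrative here, since the lag is endogenous; applied work of this kind typically relies on maximum-likelihood or instrumental-variable spatial estimators.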
Abstract:
In this paper we examine the effect of tax policy on the relationship between inequality and growth in a two-sector non-scale model. In non-scale models, the long-run equilibrium growth rate is determined by technological parameters and is independent of macroeconomic policy instruments. This does not imply, however, that fiscal policy is unimportant for long-run economic performance: it has important effects on the levels of key economic variables such as the per capita stock of capital and output. Hence, although the economy grows at the same rate across steady states, the bases for economic growth may differ. The model has three essential features: first, we explicitly model skill accumulation; second, we introduce government finance into the production function; and third, we introduce an income tax to mirror the fiscal events of the 1980s and 1990s in the US. The fact that the non-scale model is associated with higher-order dynamics enables it to replicate the distinctly non-linear nature of inequality in the US with relative ease. The results derived in this paper draw attention to the fact that the non-scale growth model not only fits the US long-run data well (Jones, 1995b) but also has unique abilities in explaining short-term fluctuations of the economy. It is shown that during the transition the response of the simulated relative wage to changes in the tax code is rather non-monotonic, in accordance with the US inequality pattern of the 1980s and early 1990s. More specifically, we analyse in detail the dynamics following the simulation of an isolated tax decrease and an isolated tax increase. After a tax decrease, the skill premium follows a lower trajectory than it would without the tax decrease; hence inequality is reduced for several periods after the fiscal shock. On the contrary, following a tax increase, the skill premium remains above the trajectory it would follow without the tax increase. Consequently, a tax increase implies a higher level of inequality in the economy.
Abstract:
This paper tests some hypotheses about the determinants of the local tax structure. In particular, we focus on the effects that the deductibility of the property tax from the national income tax has on the relative use of the property tax and user charges. We address the incentive effects that local governments face regarding the different sources of revenue by means of a model in which the local tax structure and the level of public expenditure arise as the result of the maximizing behaviour of local politicians subject to the economic effects of the tax system. We test the hypotheses developed with data for a set of Spanish municipalities over the period 1987-91. We find that tax deductibility provides incentives to raise revenues from the property tax but does not introduce a bias against user charges or in favor of overall spending growth.
Abstract:
To cosmic rays incident near the horizon, the Earth's atmosphere represents a beam dump with a slant depth reaching 36,000 g cm⁻² at 90°. The prompt decay of a heavy quark produced by very high energy cosmic ray showers will leave an unmistakable signature in this dump. We translate the failure of experiments to detect such a signal into an upper limit on the heavy quark hadroproduction cross section in the energy region beyond existing accelerators. Our results disfavor any rapid growth of the cross section or of the gluon structure function beyond conservative estimates based on perturbative QCD.
Abstract:
Background: To compare the characteristics and prognostic features of ischemic stroke in patients with and without diabetes, and to determine the independent predictors of in-hospital mortality in people with diabetes and ischemic stroke. Methods: Diabetes was diagnosed in 393 (21.3%) of 1,840 consecutive patients with cerebral infarction included in a prospective stroke registry over a 12-year period. Demographic characteristics, cardiovascular risk factors, clinical events, stroke subtypes, neuroimaging data, and outcome in ischemic stroke patients with and without diabetes were compared. Predictors of in-hospital mortality in diabetic patients with ischemic stroke were assessed by multivariate analysis. Results: People with diabetes presented atherothrombotic stroke (41.2% vs 27%) and lacunar infarction (35.1% vs 23.9%) more frequently than people without diabetes (P < 0.01). In-hospital mortality was 12.5% in ischemic stroke patients with diabetes and 14.6% in those without (P = NS). Ischemic heart disease, hyperlipidemia, subacute onset, age of 85 years or more, atherothrombotic and lacunar infarcts, and thalamic topography were independently associated with ischemic stroke in patients with diabetes, whereas predictors of in-hospital mortality included the patient's age, decreased consciousness, chronic nephropathy, congestive heart failure and atrial fibrillation. Conclusion: Ischemic stroke in people with diabetes showed a different clinical pattern from that in people without diabetes, with atherothrombotic stroke and lacunar infarcts being more frequent. Clinical factors indicative of the severity of ischemic stroke available at onset have a predominant influence on in-hospital mortality and may help clinicians to assess prognosis more accurately.
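The "multivariate analysis" used to isolate independent predictors of in-hospital death is commonly a multivariable logistic regression in stroke-registry studies; the abstract does not name the exact model. A rough sketch under that assumption, with simulated patients and illustrative variable names (not the registry's fields):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 400

# Hypothetical patient-level data (not the actual registry)
df = pd.DataFrame({
    "age": rng.normal(72, 10, n),
    "decreased_consciousness": rng.integers(0, 2, n),
    "atrial_fibrillation": rng.integers(0, 2, n),
})
logit_p = -8 + 0.08 * df["age"] + 1.2 * df["decreased_consciousness"] + 0.7 * df["atrial_fibrillation"]
df["in_hospital_death"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

X = sm.add_constant(df[["age", "decreased_consciousness", "atrial_fibrillation"]])
model = sm.Logit(df["in_hospital_death"], X).fit(disp=0)

# Odds ratios with 95% confidence intervals for each candidate predictor
print(np.exp(model.params))
print(np.exp(model.conf_int()))
```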
Abstract:
Background: Mortality among patients who complete tuberculosis (TB) treatment is still high among vulnerable populations. The objective of the study was to identify the probability of death and its predictive factors in a cohort of successfully treated TB patients. Methods: A population-based retrospective longitudinal study was performed in Barcelona, Spain. All patients who successfully completed TB treatment with culture confirmation and available drug susceptibility testing between 1995 and 1997 were retrospectively followed up until December 31, 2005 by the Barcelona TB Control Program. Socio-demographic, clinical, microbiological and treatment variables were examined. Mortality, TB Program and AIDS registries were reviewed. Kaplan-Meier and Cox regression methods with time-dependent covariates were used for the survival analysis, calculating hazard ratios (HR) with 95% confidence intervals (CI). Results: Among the 762 included patients, the median age was 36 years, 520 (68.2%) were male, 178 (23.4%) were HIV-infected, and 208 (27.3%) were alcohol abusers. Of the 134 (17.6%) injecting drug users (IDU), 123 (91.8%) were HIV-infected. A total of 30 (3.9%) recurrences and 173 deaths (22.7%) occurred (mortality rate: 3.4/100 person-years of follow-up). The predictors of death were: age between 41 and 60 years (HR: 3.5; CI: 2.1-5.7), age greater than 60 years (HR: 14.6; CI: 8.9-24), alcohol abuse (HR: 1.7; CI: 1.2-2.4) and HIV-infected IDU (HR: 7.9; CI: 4.7-13.3). Conclusions: Mortality among TB patients who completed treatment is associated with vulnerable populations such as the elderly, alcohol abusers, and HIV-infected IDU. We therefore need to fight against poverty and to promote and develop interventions and social policies directed towards these populations to improve their survival.
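As a rough illustration of the survival workflow described (Kaplan-Meier estimation plus Cox regression reported as hazard ratios with 95% CIs), here is a minimal sketch using the lifelines package on simulated follow-up data; it omits the time-dependent covariates the study used, and every variable name is hypothetical:

```python
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

rng = np.random.default_rng(2)
n = 500

# Simulated follow-up data (placeholders for the cohort variables)
df = pd.DataFrame({
    "alcohol_abuse": rng.integers(0, 2, n),
    "age_over_60": rng.integers(0, 2, n),
})
baseline = rng.exponential(8, n)                       # years
df["time"] = baseline / np.exp(0.5 * df["alcohol_abuse"] + 1.0 * df["age_over_60"])
df["death"] = (df["time"] < 10).astype(int)            # administrative censoring at 10 years
df.loc[df["death"] == 0, "time"] = 10.0

# Kaplan-Meier estimate of overall survival
km = KaplanMeierFitter().fit(df["time"], event_observed=df["death"])
print(km.median_survival_time_)

# Cox model: the exp(coef) column of the summary gives hazard ratios with 95% CIs
cph = CoxPHFitter().fit(df, duration_col="time", event_col="death")
cph.print_summary()
```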
Abstract:
The prediction of rockfall travel distance below a rock cliff is an indispensable step in rockfall susceptibility, hazard and risk assessment. Although the size of the detached rock mass may differ considerably at each specific rock cliff, small rockfalls (<100 m³) are the most frequent process. Empirical models may provide suitable information for predicting the travel distance of small rockfalls over an extensive area at a medium scale (1:100,000-1:25,000). "Solà d'Andorra la Vella" is a rocky slope located close to the town of Andorra la Vella, where the government has been documenting rockfalls since 1999. This documentation consists of mapping the release point and the individual fallen blocks immediately after each event. The documentation of historical rockfalls through morphological analysis, eye-witness accounts and historical images serves to increase the available information. In total, data from twenty small rockfalls have been gathered, comprising about a hundred individual fallen rock blocks. The data acquired have been used to check the reliability of the most widely adopted empirical models (the reach and shadow angle models) and to analyse the influence of the parameters affecting the travel distance (rockfall size, height of fall along the rock cliff and volume of the individual fallen rock block). For predicting travel distances on maps at medium scales, a method has been proposed based on the "reach probability" concept. The accuracy of the results has been tested against the line connecting the farthest fallen boulders, which represents the maximum travel distance of past rockfalls. The paper concludes with a discussion of the application of both empirical models to other study areas.
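Both empirical models the paper tests reduce to simple angle geometry: the reach angle is the inclination of the line joining the rockfall release point to the block's stopping point, while the shadow angle is measured from the apex of the talus slope instead. A minimal sketch of the two calculations for a single block, with made-up coordinates rather than data from Solà d'Andorra:

```python
import math

# Hypothetical elevations (m) and horizontal distances (m); not data from the study site
release_z, release_x = 1250.0, 0.0        # rockfall source on the cliff
apex_z, apex_x = 1180.0, 60.0             # apex of the talus slope
stop_z, stop_x = 1130.0, 210.0            # farthest point reached by the block

def travel_angle(z_top, x_top, z_stop, x_stop):
    """Inclination (degrees) of the line from an upper reference point to the stopping point."""
    return math.degrees(math.atan2(z_top - z_stop, x_stop - x_top))

reach_angle = travel_angle(release_z, release_x, stop_z, stop_x)   # reach-angle model
shadow_angle = travel_angle(apex_z, apex_x, stop_z, stop_x)        # shadow-angle model

print(f"reach angle:  {reach_angle:.1f} deg")
print(f"shadow angle: {shadow_angle:.1f} deg")
```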
Abstract:
Purpose: The application of NIC 32 to cooperatives has generated considerable controversy in recent years. To date, several studies have attempted to anticipate the possible effects of its application. This paper aims to analyse the impact of the first-time application of NIC 32 on the cooperative sector. Design/methodology/approach: A sample of 98 cooperatives was selected and a comparative analysis of their financial information before and after the application of NIC 32 was carried out to determine the existing differences. The Wilcoxon rank-sum test was used to check whether these differences are significant, and the Mann-Whitney U test was used to check whether there are significant differences in the relative impact of applying NIC 32 across different groups of cooperatives (a minimal sketch of both tests follows the keywords). Finally, the effects of applying NIC 32 on the cooperatives' financial and economic position, and on the evolution of their intangible assets, were analysed using financial statement analysis techniques. Contributions and results: The results confirm that the application of NIC 32 causes significant differences in some balance sheet and income statement items, as well as in the ratios analysed. The main differences consist of a reduction in the level of capitalisation and an increase in the indebtedness of cooperatives, together with a general worsening of the solvency and financial autonomy ratios. Limitations: It should be borne in mind that the study was carried out on a sample of cooperatives that are required to have their annual accounts audited, so the results should be interpreted in the context of large cooperatives. It should also be noted that we compared the annual accounts for 2011 and 2010, which allowed us to identify the differences in the cooperatives' financial information before and after applying NIC 32, although some of these differences could also be caused by other factors such as the economic situation, changes in the application of accounting standards, etc. Originality/added value: We believe this is the right time to carry out this research, since from 2011 onwards all Spanish cooperatives have had to apply the accounting standards adapted to NIC 32. Moreover, to the best of our knowledge, there are no similar studies based on the annual accounts of cooperatives that have already applied the accounting standards adapted to NIC 32. We believe the results of this research may be useful for different stakeholders: first, so that accounting standard-setters can understand the impact of NIC 32 on cooperatives and propose improvements to the content of the standard; second, so that cooperatives themselves, federations, confederations and other cooperative bodies have information on the economic impact of the first-time application of NIC 32 and can make whatever assessments they consider appropriate; and third, so that financial institutions, auditors and advisers of cooperatives and other stakeholders have information on the changes in cooperatives' annual accounts and can take them into account when making decisions.
Keywords: cooperatives, net equity, share capital, NIC 32, solvency, effects of accounting regulation, financial information, ratios.
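The two rank tests named in the methodology are available in scipy. A minimal sketch on made-up ratio data: because the before/after comparison involves the same cooperatives in 2010 and 2011, the sketch uses the paired signed-rank variant (scipy.stats.wilcoxon), which is an assumption about the exact Wilcoxon test applied, while mannwhitneyu handles the between-group comparison of relative impact:

```python
import numpy as np
from scipy.stats import wilcoxon, mannwhitneyu

rng = np.random.default_rng(3)
n = 98                                   # sample size reported in the abstract

# Hypothetical solvency ratios before (2010) and after (2011) applying NIC 32
solvency_before = rng.normal(1.8, 0.4, n)
solvency_after = solvency_before - rng.normal(0.3, 0.2, n)   # reclassification lowers equity

# Paired comparison: has the ratio changed significantly after first-time application?
stat, p_paired = wilcoxon(solvency_before, solvency_after)
print(f"Wilcoxon signed-rank p-value: {p_paired:.4f}")

# Relative impact compared between two groups of cooperatives (illustrative split)
relative_impact = (solvency_after - solvency_before) / solvency_before
group = rng.integers(0, 2, n).astype(bool)
stat_u, p_groups = mannwhitneyu(relative_impact[group], relative_impact[~group])
print(f"Mann-Whitney U p-value: {p_groups:.4f}")
```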
Abstract:
The main goal of this article is to provide an answer to the question: "Does anything forecast exchange rates, and if so, which variables?". It is well known that exchange rate fluctuations are very difficult to predict using economic models, and that a random walk forecasts exchange rates better than any economic model (the Meese and Rogoff puzzle). However, the recent literature has identified a series of fundamentals/methodologies that claim to have resolved the puzzle. This article provides a critical review of the recent literature on exchange rate forecasting and illustrates the new methodologies and fundamentals that have been recently proposed in an up-to-date, thorough empirical analysis. Overall, our analysis of the literature and the data suggests that the answer to the question "Are exchange rates predictable?" is "It depends" - on the choice of predictor, forecast horizon, sample period, model, and forecast evaluation method. Predictability is most apparent when one or more of the following hold: the predictors are Taylor rule or net foreign assets fundamentals, the model is linear, and a small number of parameters are estimated. The toughest benchmark is the random walk without drift.
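To make the forecast-evaluation point concrete, here is a minimal sketch of the exercise behind the Meese-Rogoff comparison: rolling one-step-ahead forecasts from a driftless random walk versus a small linear model on one fundamental, compared by out-of-sample RMSE. The series and the single "fundamental" are simulated placeholders, not any of the predictors surveyed in the article:

```python
import numpy as np

rng = np.random.default_rng(4)
T = 300

# Simulated log exchange rate and one stand-in fundamental (e.g., an interest differential)
fundamental = rng.normal(0, 1, T)
shocks = rng.normal(0, 0.01, T)
s = np.cumsum(0.002 * fundamental + shocks)          # log exchange rate

window = 120
rw_errors, model_errors = [], []
for t in range(window, T - 1):
    # Driftless random walk: tomorrow's rate equals today's
    rw_errors.append(s[t + 1] - s[t])

    # Linear model: regress past changes on the lagged fundamental (few parameters)
    ds = np.diff(s[t - window:t + 1])
    X = np.column_stack([np.ones(window), fundamental[t - window:t]])
    beta, *_ = np.linalg.lstsq(X, ds, rcond=None)
    forecast = s[t] + beta[0] + beta[1] * fundamental[t]
    model_errors.append(s[t + 1] - forecast)

rmse = lambda e: float(np.sqrt(np.mean(np.square(e))))
print(f"RMSE random walk:  {rmse(rw_errors):.5f}")
print(f"RMSE linear model: {rmse(model_errors):.5f}")
```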
Abstract:
In this work we explore multivariate empirical mode decomposition (EMD) combined with a neural network classifier as a technique for face recognition tasks. Images are simultaneously decomposed by means of EMD, and the distance between the modes of each image and the modes of the representative image of each class is calculated using three different distance measures. A neural network is then trained using 10-fold cross-validation in order to derive a classifier. Preliminary results (over 98% classification rate) are satisfactory and justify a deeper investigation of how to apply mEMD to face recognition.
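The pipeline described (decompose, measure the distance between a sample's modes and each class representative's modes, feed the distances to a neural network evaluated with 10-fold cross-validation) can be sketched roughly as follows. The sketch substitutes the univariate EMD from the PyEMD package for the multivariate EMD used in the paper, treats images as flattened 1-D signals, and uses a single Euclidean distance instead of the three measures compared; the data are random placeholders:

```python
import numpy as np
from PyEMD import EMD                      # pip install EMD-signal
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
n_per_class, length, n_imfs = 40, 256, 3

# Placeholder "images" flattened to 1-D signals, two classes
signals = rng.normal(0, 1, (2 * n_per_class, length))
labels = np.repeat([0, 1], n_per_class)

def first_imfs(signal, k=n_imfs):
    """First k intrinsic mode functions of a 1-D signal (zero-padded if fewer)."""
    imfs = EMD().emd(signal, max_imf=k)[:k]
    out = np.zeros((k, signal.size))
    out[: imfs.shape[0]] = imfs
    return out

modes = np.array([first_imfs(s) for s in signals])

# Representative modes per class: element-wise mean of that class's IMFs
# (a rigorous evaluation would recompute these inside each training fold)
representatives = {c: modes[labels == c].mean(axis=0) for c in (0, 1)}

# Feature vector: Euclidean distance between a sample's modes and each class representative
features = np.array([[np.linalg.norm(m - representatives[c]) for c in (0, 1)] for m in modes])

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
scores = cross_val_score(clf, features, labels, cv=10)     # 10-fold cross-validation
print(f"mean classification rate: {scores.mean():.2f}")
```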
Abstract:
Artifacts are present in most electroencephalography (EEG) recordings, making it difficult to interpret or analyze the data. In this paper a cleaning procedure based on a multivariate extension of empirical mode decomposition is used to improve the quality of the data. This is achieved by applying the cleaning method to raw EEG data. A synchrony measure is then applied to both the raw and the cleaned data in order to compare the improvement in classification rate. Two classifiers are used, linear discriminant analysis and neural networks. In both cases, the classification rate improves by about 20%.
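The evaluation logic of this abstract (the same classifier run on raw versus cleaned features, and the gain in classification rate reported) can be sketched independently of the MEMD cleaning itself. In the sketch below the "cleaning" is simply the removal of a known simulated artifact, a stand-in for the multivariate-EMD procedure; the point is the raw-versus-clean comparison with scikit-learn's linear discriminant analysis:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(6)
n, d = 200, 8

# Simulated class-discriminative "synchrony" features for two mental states
labels = np.repeat([0, 1], n // 2)
clean = rng.normal(0, 1, (n, d)) + labels[:, None] * 0.8

# Raw data: the same features buried under a strong common artifact (e.g., ocular activity)
artifact = rng.normal(0, 3, (n, 1)) * np.ones((1, d))
raw = clean + artifact

lda = LinearDiscriminantAnalysis()
acc_raw = cross_val_score(lda, raw, labels, cv=10).mean()
acc_clean = cross_val_score(lda, clean, labels, cv=10).mean()
print(f"raw: {acc_raw:.2f}  clean: {acc_clean:.2f}")   # cleaning should lift the rate
```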
Abstract:
This study analysed the interaction between organisational change, cultural values and technological change in the Catalan healthcare system. The study is divided into five parts. The first is a content analysis of health-related websites in Catalonia. The second is a study of Internet use for health-related matters among the general population, patient associations and health professionals, based on an online survey adapted to each of these groups. The third part is a fieldwork study of the pilot programmes carried out by the Catalan Government in several local areas and hospitals to electronically integrate patients' clinical records. The fourth is a study of the organisational implications of introducing information systems into the management of hospitals and primary care centres at the Institut Català de Salut, the main public healthcare provider in Catalonia, based on an online survey and in-depth interviews. The fifth part is a case study of the organisational and social effects of introducing information and communication technologies in one of Catalonia's main hospitals, the Hospital Clínic de Barcelona. The study was carried out between May 2005 and July 2007.
Abstract:
Background: In longitudinal studies where subjects experience recurrent incidents over a period of time, such as respiratory infections, fever or diarrhea, statistical methods are required to take into account the within-subject correlation. Methods: For repeated events data with censored failure times, the independent increment (AG), marginal (WLW) and conditional (PWP) models are three multiple failure models that generalize Cox's proportional hazards model. In this paper, we review the efficiency, accuracy and robustness of all three models under simulated scenarios with varying degrees of within-subject correlation, censoring levels, maximum number of possible recurrences and sample size. We also study the methods' performance on a real dataset from a cohort study with bronchial obstruction. Results: We find substantial differences between the methods, and no single method is optimal. AG and PWP seem to be preferable to WLW for low correlation levels, but the situation reverses for high correlations. Conclusions: All methods are robust to censoring, worsen with increasing recurrence levels and share a bias problem which, among other consequences, makes asymptotic normal confidence intervals not fully reliable, although they are well developed theoretically.
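Of the three generalisations compared, the independent-increment (AG) model is the most direct to sketch: each recurrence contributes a (start, stop] row in counting-process format and all intervals share a common baseline hazard. A minimal sketch with lifelines' CoxTimeVaryingFitter on a tiny hand-made table, shown only to illustrate the data layout rather than the paper's simulation study; the WLW and PWP models require different risk-set definitions not shown here, and the AG model as used in practice adds a robust within-subject variance correction:

```python
import pandas as pd
from lifelines import CoxTimeVaryingFitter

# Counting-process layout: one row per at-risk interval, recurrences marked by event=1
rows = [
    # id, start, stop, event, exposed
    (1, 0,  30, 1, 1),      # subject 1: first infection at day 30
    (1, 30, 55, 1, 1),      # second infection at day 55
    (1, 55, 90, 0, 1),      # censored at day 90
    (2, 0,  40, 1, 1),
    (2, 40, 90, 0, 1),
    (3, 0,  60, 1, 0),
    (3, 60, 90, 0, 0),
    (4, 0,  90, 0, 0),
]
df = pd.DataFrame(rows, columns=["id", "start", "stop", "event", "exposed"])

# AG-style fit: intervals enter as independent increments with a common baseline hazard
ctv = CoxTimeVaryingFitter()
ctv.fit(df, id_col="id", start_col="start", stop_col="stop", event_col="event")
ctv.print_summary()          # exp(coef) is the hazard ratio for the 'exposed' covariate
```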
Abstract:
This paper analyses the effect of R&D investment on firm growth. We use an extensive sample of Spanish manufacturing and service firms. The database comprises several waves of the Spanish Community Innovation Survey and covers the period 2004–2008. First, a probit model corrected for sample selection analyses the role of innovation in the probability of being a high-growth firm (HGF). Second, a quantile regression technique is applied to explore the determinants of firm growth. Our database shows that a small number of firms experience fast growth rates in terms of sales or employees. Our results reveal that R&D investments positively affect the probability of becoming a HGF. However, differences appear between manufacturing and service firms. Finally, when we study the impact of R&D investment on firm growth, quantile estimates show that internal R&D has a significant positive impact for the upper quantiles, while external R&D shows a significant positive impact up to the median. Keywords: High-growth firms, firm growth, innovation activity. JEL classifications: L11, L25, L26, O30
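The quantile-regression step can be illustrated with statsmodels' QuantReg: the same growth equation estimated at several quantiles, so that the coefficient on internal R&D may differ between the median and the upper tail, as the abstract reports. The firm-level variables below are simulated placeholders, not the Community Innovation Survey data:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 1000

# Hypothetical firm-level data: growth rates and R&D intensities (placeholders)
df = pd.DataFrame({
    "internal_rd": rng.exponential(0.05, n),
    "external_rd": rng.exponential(0.02, n),
    "size": rng.normal(3, 1, n),
})
noise = rng.normal(0, 0.15, n) * (1 + 5 * df["internal_rd"])   # R&D widens the upper tail
df["growth"] = 0.02 + 0.1 * df["external_rd"] - 0.01 * df["size"] + noise

# Same specification at different points of the conditional growth distribution
for q in (0.25, 0.50, 0.75, 0.90):
    fit = smf.quantreg("growth ~ internal_rd + external_rd + size", df).fit(q=q)
    print(f"q={q:.2f}  internal_rd: {fit.params['internal_rd']:+.3f}  "
          f"external_rd: {fit.params['external_rd']:+.3f}")
```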