985 results for Regulatory Models
Abstract:
This paper does two things. First, it presents alternative approaches to the standard methods of estimating productive efficiency using a production function. It favours a parametric approach (viz. the stochastic production frontier approach) over a nonparametric approach (e.g. data envelopment analysis), and, further, one that provides a statistical explanation of efficiency as well as an estimate of its magnitude. Second, it illustrates the favoured approach (i.e. the 'single-stage procedure') with estimates of two models of explained inefficiency, using data from the Thai manufacturing sector after the crisis of 1997. Technical efficiency is modelled as dependent on capital investment in three major areas (viz. land, machinery and office appliances), where land is intended to proxy the effects of unproductive, speculative capital investment, and both machinery and office appliances are intended to proxy the effects of productive, non-speculative capital investment. The estimates from these models cast new light on the five-year post-1997 crisis period in Thailand, suggesting a structural shift from relatively labour-intensive to relatively capital-intensive manufacturing production between 1998 and 2002.
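For readers unfamiliar with the 'single-stage procedure', a minimal sketch of the standard stochastic frontier model with inefficiency effects may help; the Cobb-Douglas form and the exact coding of the inefficiency covariates below are illustrative assumptions, not the paper's specification:

$$\ln y_i = \beta_0 + \sum_k \beta_k \ln x_{ki} + v_i - u_i, \qquad v_i \sim N(0,\sigma_v^2),$$
$$u_i \sim N^{+}\!\left(\mu_i, \sigma_u^2\right), \qquad \mu_i = \delta_0 + \delta_1\,\mathrm{land}_i + \delta_2\,\mathrm{machinery}_i + \delta_3\,\mathrm{office}_i,$$

with technical efficiency $TE_i = \exp(-u_i)$. "Single stage" refers to estimating the frontier parameters $\beta$ and the inefficiency-effect parameters $\delta$ jointly by maximum likelihood, rather than regressing estimated efficiencies on covariates in a second step.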
Abstract:
Block factor methods offer an attractive approach to forecasting with many predictors. These methods extract the information in the predictors into factors reflecting different blocks of variables (e.g. a price block, a housing block, a financial block, etc.). However, a forecasting model which simply includes all blocks as predictors risks being over-parameterized. Thus, it is desirable to use a methodology which allows different parsimonious forecasting models to hold at different points in time. In this paper, we use dynamic model averaging and dynamic model selection to achieve this goal. These methods automatically alter the weights attached to different forecasting models as evidence arrives about which has forecast well in the recent past. In an empirical study forecasting output growth and inflation using 139 UK monthly time series variables, we find that the set of relevant predictors changes substantially over time. Furthermore, our results show that dynamic model averaging and model selection can greatly improve forecast performance relative to traditional forecasting methods.
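A minimal sketch of the model-probability recursion that drives dynamic model averaging, in its common forgetting-factor form; the function name and interface are illustrative, not taken from the paper:

```python
import numpy as np

def dma_weights(pred_densities, alpha=0.99):
    """Recursively update model probabilities for dynamic model averaging.

    pred_densities: (T, K) array; entry [t, k] is model k's predictive
    density (likelihood) for the observation at time t.
    alpha: forgetting factor; alpha = 1 recovers static Bayesian model averaging.
    Returns a (T, K) array of probabilities pi_{t|t-1} used to weight the
    K models' forecasts at each date.
    """
    T, K = pred_densities.shape
    post = np.full(K, 1.0 / K)       # pi_{0|0}: flat prior over models
    weights = np.empty((T, K))
    for t in range(T):
        prior = post ** alpha
        prior /= prior.sum()         # prediction step: forgetting-factor update
        weights[t] = prior
        post = prior * pred_densities[t]
        post /= post.sum()           # Bayesian update with period-t forecast fit
    return weights
```

Dynamic model selection uses the same recursion but forecasts with only the highest-weight model at each date, rather than averaging across all K.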
Abstract:
Report on the scientific sojourn carried out at the University of New South Wales from February to June 2007. Two different biogeochemical models are coupled to a three-dimensional configuration of the Princeton Ocean Model (POM) for the Northwestern Mediterranean Sea (Ahumada and Cruzado, 2007). The first biogeochemical model (BLANES) is the three-dimensional version of the model described by Bahamon and Cruzado (2003) and computes the nitrogen fluxes through six compartments using semi-empirical descriptions of biological processes. The second biogeochemical model (BIOMEC) is the biomechanical NPZD model described in Baird et al. (2004), which uses a combination of physiological and physical descriptions to quantify the rates of planktonic interactions. Physical descriptions include, for example, the diffusion of nutrients to phytoplankton cells and the encounter rate of predators and prey. In both models, the link between physical and biogeochemical processes is expressed by the advection-diffusion of the non-conservative tracers. The similarities in the mathematical formulation of the biogeochemical processes in the two models are exploited to determine the parameter set for the biomechanical model that best matches the parameter set used in the first model. Three years of integration have been carried out for each model to reach the so-called perpetual-year run for biogeochemical conditions. Outputs from both models are averaged monthly and then compared with chlorophyll estimates from remote sensing images obtained by the MERIS sensor.
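The advection-diffusion link mentioned above has, for each non-conservative tracer, the generic form below; this is the standard formulation, with notation assumed rather than quoted from the report:

$$\frac{\partial C}{\partial t} + \mathbf{u}\cdot\nabla C = \nabla\cdot\left(K\,\nabla C\right) + S_{\mathrm{bio}}(C),$$

where $C$ is a biogeochemical tracer concentration (e.g. nutrient or plankton nitrogen), $\mathbf{u}$ and $K$ are the velocity and diffusivity fields supplied by POM, and $S_{\mathrm{bio}}$ is the biological source-sink term, which is where BLANES and BIOMEC differ.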
Abstract:
This paper develops methods for Stochastic Search Variable Selection (currently popular with regression and Vector Autoregressive models) for Vector Error Correction models where there are many possible restrictions on the cointegration space. We show how this allows the researcher to begin with a single unrestricted model and either do model selection or model averaging in an automatic and computationally efficient manner. We apply our methods to a large UK macroeconomic model.
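For readers unfamiliar with SSVS, the core device is a spike-and-slab mixture prior on each coefficient; the sketch below is the standard George-McCulloch form, and its adaptation to restrictions on the cointegration space is the paper's contribution:

$$\beta_j \mid \gamma_j \;\sim\; (1-\gamma_j)\,N\!\left(0,\tau_{0j}^2\right) + \gamma_j\,N\!\left(0,\tau_{1j}^2\right), \qquad \gamma_j \in \{0,1\}, \quad \tau_{0j} \ll \tau_{1j},$$

with independent Bernoulli priors on the $\gamma_j$. Gibbs sampling over $(\beta,\gamma)$ delivers posterior inclusion probabilities: restricting attention to the highest-probability configuration gives model selection, while averaging over the draws gives model averaging.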
Abstract:
1. Model-based approaches have been used increasingly in conservation biology over recent years. Species presence data used for predictive species distribution modelling are abundant in natural history collections, whereas reliable absence data are sparse, most notably for vagrant species such as butterflies and snakes. As predictive methods such as generalized linear models (GLM) require absence data, various strategies have been proposed to select pseudo-absence data. However, only a few studies exist that compare different approaches to generating these pseudo-absence data. 2. Natural history collection data are usually available for long periods of time (decades or even centuries), thus allowing historical considerations. However, this historical dimension has rarely been assessed in studies of species distribution, although there is great potential for understanding current patterns, i.e. the past is the key to the present. 3. We used GLM to model the distributions of three 'target' butterfly species, Melitaea didyma, Coenonympha tullia and Maculinea teleius, in Switzerland. We developed and compared four strategies for defining pools of pseudo-absence data and applied them to natural history collection data from the last 10, 30 and 100 years. Pools included: (i) sites without target species records; (ii) sites where butterfly species other than the target species were present; (iii) sites without butterfly species but with habitat characteristics similar to those required by the target species; and (iv) a combination of the second and third strategies. Models were evaluated and compared by the total deviance explained, the maximized Kappa and the area under the curve (AUC). 4. Among the four strategies, model performance was best for strategy 3. Contrary to expectations, strategy 2 resulted in even lower model performance than models with pseudo-absence data sampled entirely at random (strategy 1). 5. Independent of the strategy, model performance was enhanced when sites with historical species presence data were not used as pseudo-absence data. The combination of strategy 3 with species records from the last 100 years therefore achieved the highest model performance. 6. Synthesis and applications. The protection of suitable habitat for species survival or reintroduction in rapidly changing landscapes is a high priority among conservationists. Model-based approaches offer planning authorities the possibility of delimiting priority areas for species detection or habitat protection. The performance of these models can be enhanced by fitting them with pseudo-absence data drawn from large archives of natural history collection presence data rather than with randomly sampled pseudo-absence data.
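As a concrete illustration of strategy 3, the sketch below draws pseudo-absences from candidate sites whose habitat covariates most resemble the presence sites and fits a binomial GLM. The similarity measure, the function name and the use of statsmodels are illustrative assumptions, not the study's implementation:

```python
import numpy as np
import statsmodels.api as sm

def fit_glm_with_pseudo_absences(presence_env, candidate_env, n_absence, rng=None):
    """Presence / pseudo-absence logistic GLM, sketching 'strategy 3'.

    presence_env:  (n_p, d) environmental covariates at presence sites.
    candidate_env: (n_c, d) covariates at sites lacking records of the target
                   species; pseudo-absences are drawn from candidates whose
                   habitat is most similar to the presence sites.
    """
    rng = rng or np.random.default_rng(0)
    # Habitat similarity: Euclidean distance to the presence centroid in
    # standardized environmental space (an illustrative choice).
    mu, sd = presence_env.mean(0), presence_env.std(0)
    dist = np.linalg.norm((candidate_env - mu) / sd, axis=1)
    pool = np.argsort(dist)[: 5 * n_absence]   # similar-habitat candidate pool
    absent = candidate_env[rng.choice(pool, n_absence, replace=False)]

    # Stack presences (y = 1) and pseudo-absences (y = 0), fit binomial GLM.
    X = sm.add_constant(np.vstack([presence_env, absent]))
    y = np.r_[np.ones(len(presence_env)), np.zeros(n_absence)]
    return sm.GLM(y, X, family=sm.families.Binomial()).fit()
```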
Abstract:
This paper develops stochastic search variable selection (SSVS) for zero-inflated count models, which are commonly used in health economics. This allows for either model averaging or model selection in situations with many potential regressors. The proposed techniques are applied to a German data set on the demand for health care. A package for the free statistical software environment R is provided.
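The zero-inflated Poisson case makes the setting concrete; this is the standard formulation, and the paper's setup may also cover other count distributions:

$$P(y_i = 0) = p_i + (1-p_i)\,e^{-\lambda_i}, \qquad P(y_i = k) = (1-p_i)\,\frac{\lambda_i^{k} e^{-\lambda_i}}{k!}, \quad k \geq 1,$$

with $\log \lambda_i = x_i'\beta$ and $\operatorname{logit}(p_i) = z_i'\delta$. SSVS places a spike-and-slab mixture prior on each element of $\beta$ and $\delta$, so regressors can be retained or dropped separately in the count equation and the zero-inflation equation.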
Abstract:
Background: Alzheimer's disease (AD) is the leading form of dementia worldwide. The Aβ peptide is believed to be the major pathogenic compound of the disease. For several years it has been hypothesized that Aβ impacts the Wnt signaling cascade, and activation of this signaling pathway has therefore been proposed to rescue the neurotoxic effect of Aβ. Findings: Expression of human Aβ42 in the Drosophila nervous system leads to a drastically shortened life span. We found that the action of Aβ42 specifically in the glutamatergic motoneurons is responsible for the reduced survival. However, we find that the morphology of the glutamatergic larval neuromuscular junctions, which are widely used as a model for mammalian central nervous system synapses, is not affected by Aβ42 expression. We furthermore demonstrate that genetic activation of the Wnt signal transduction pathway in the nervous system is not able to rescue the shortened life span or a rough eye phenotype in Drosophila. Conclusions: Our data confirm that life span is a useful readout of Aβ42-induced neurotoxicity in Drosophila; the neuromuscular junction, however, does not seem to be an appropriate model to study AD in flies. Additionally, our results challenge the hypothesis that Wnt signaling is implicated in Aβ42 toxicity and might serve as a drug target against AD.
Abstract:
BACKGROUND: The zebrafish is a clinically relevant model of heart regeneration. Unlike mammals, it has a remarkable capacity for heart repair after injury and promises novel translational applications. Amputation and cryoinjury models are key research tools for understanding injury response and regeneration in vivo. An understanding of the transcriptional responses following injury is needed to identify key players in heart tissue repair, as well as potential targets for boosting this property in humans. RESULTS: We investigated amputation and cryoinjury in vivo models of heart damage in the zebrafish through unbiased, integrative analyses of independent molecular datasets. To detect genes with potential biological roles, we derived computational prediction models from microarray data from heart amputation experiments. We focused on a top-ranked set of genes highly activated in the early post-injury stage, whose activity was further verified in independent microarray datasets. Next, we performed independent validation of expression responses with qPCR in a cryoinjury model. Across in vivo models, the top candidates showed highly concordant responses at 1 and 3 days post-injury, which highlights the predictive power of our analysis strategies and the possible biological relevance of these genes. Top candidates are significantly involved in cell fate specification and differentiation, and include heart failure markers such as periostin, as well as potential new targets for heart regeneration. For example, ptgis and ca2 were overexpressed, while usp2a, a regulator of the p53 pathway, was down-regulated in our in vivo models. Interestingly, high activity of ptgis and ca2 has previously been observed in failing hearts from rats and humans. CONCLUSIONS: We identified genes with potentially critical roles in the response to cardiac damage in the zebrafish. Their transcriptional activities are reproducible in different in vivo models of cardiac injury.
Abstract:
We propose an alternative approach to obtaining a permanent equilibrium exchange rate (PEER), based on an unobserved components (UC) model. This approach offers a number of advantages over the conventional cointegration-based PEER. Firstly, we do not rely on the prerequisite that cointegration must be found between the real exchange rate and macroeconomic fundamentals to obtain non-spurious long-run relationships and the PEER. Secondly, the impact that the permanent and transitory components of the macroeconomic fundamentals have on the real exchange rate can be modelled separately in the UC model. This is important for variables whose long- and short-run effects may drive the real exchange rate in opposite directions, such as the relative government expenditure ratio. We also demonstrate that our proposed exchange rate models have good out-of-sample forecasting properties. Our approach would be a useful technique for central banks to estimate the equilibrium exchange rate and to forecast the long-run movements of the exchange rate.
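A minimal one-fundamental sketch of the UC decomposition just described; the paper's specification involves several fundamentals, and the notation here is an illustrative assumption:

$$f_t = \mu_t + \tau_t, \qquad \mu_t = \mu_{t-1} + \eta_t, \qquad \tau_t = \phi\,\tau_{t-1} + \nu_t, \quad |\phi| < 1,$$
$$q_t = \beta_P\,\mu_t + \beta_T\,\tau_t + \varepsilon_t, \qquad \mathrm{PEER}_t = \beta_P\,\mu_t,$$

so the random-walk permanent component $\mu_t$ and the stationary transitory component $\tau_t$ of the fundamental $f_t$ carry separate loadings ($\beta_P$, $\beta_T$) on the real exchange rate $q_t$, capturing exactly the case where long- and short-run effects pull in opposite directions. The system is cast in state-space form and estimated with the Kalman filter.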
Abstract:
This paper investigates the role of institutions in determining per capita income levels and growth. It contributes to the empirical literature by using different variables as proxies for institutions and by developing a deeper analysis of the issues arising from the use of weak instruments, and of too many instruments, in per capita income and growth regressions. The cross-section estimation suggests that institutions seem to matter, regardless of whether they are the only explanatory variable or are combined with geographical and integration variables, although most models suffer from weak instruments. The growth models yield some interesting findings: there is mixed evidence on the role of institutions, and such evidence is most closely associated with the law-and-order and investment-profile measures; government spending is an important policy variable; and collapsing the number of instruments results in fewer significant coefficients for institutions.
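The weak-instrument issue flagged above arises in the familiar two-equation setup; this is a generic sketch of this literature's cross-section specification, not the paper's exact model:

$$\ln y_i = \alpha + \beta\,\mathrm{INST}_i + \gamma' X_i + \varepsilon_i, \qquad \mathrm{INST}_i = \pi' Z_i + u_i,$$

where institutional quality $\mathrm{INST}_i$ is endogenous and instrumented by $Z_i$. When the first-stage coefficients $\pi$ are close to zero the instruments are weak and two-stage least squares estimates of $\beta$ are unreliable; in the dynamic panel growth regressions, an instrument count that grows with the lag structure instead overfits the endogenous regressor, which is why collapsing the instrument set changes the significance of the institution coefficients.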
Abstract:
The classic organization of a gene structure has followed the Jacob and Monod bacterial gene model proposed more than 50 years ago. Since then, empirical determinations of the complexity of transcriptomes, from yeast to human, have blurred the definition and physical boundaries of genes. Using multiple analysis approaches, we have characterized the boundaries of individual genes mapping to human chromosomes 21 and 22. Analyses of the locations of the 5' and 3' transcriptional termini of 492 protein-coding genes revealed that for 85% of these genes the boundaries extend beyond the currently annotated termini, most often connecting with exons of transcripts from other well-annotated genes. The biological and evolutionary importance of these chimeric transcripts is underscored by (1) the non-random interconnections of the genes involved, (2) the greater phylogenetic depth of the genes involved in many chimeric interactions, (3) the coordinated expression of connected genes and (4) the close in vivo, three-dimensional proximity of the genomic regions being transcribed and contributing to parts of the chimeric RNAs. The non-random nature of these connections suggests that chimeric transcripts should not be studied in isolation, but together, as an RNA network.
Abstract:
The revival of support for a living wage has reopened a long-running debate over the extent to which active regulation of labour markets may be necessary to attain desired outcomes. Market failure is suggested to result in lower wages and remuneration for low-skilled workers than might otherwise be expected from models of perfect competition. This paper examines the theoretical underpinning of living wage campaigns and demonstrates that once we move away from idealised models of perfect competition to one where employers retain power over the bargaining process, such as monopsony, it is readily understandable that low wages may be endemic in low-skilled employment contracts. The paper then examines evidence, derived from the UK Quarterly Labour Force Survey, on the extent to which a living wage would address low pay within the labour force. We highlight the greater incidence of low pay within the private sector and then focus upon the public sector, where the Living Wage demand has had most impact. We examine the extent to which addressing low pay within the public sector increases costs. We further highlight evidence that low pay is concentrated among young and women workers in the public sector (in particular lone-parent women workers) but not, perhaps surprisingly, among workers from ethnic minority backgrounds. The paper then builds upon the results from the Quarterly Labour Force Survey with an analysis of the British Household Panel Survey in order to examine the impact that introducing a living wage in the public sector would have on household inequality. The paper concludes that a living wage is indeed an appropriate regulatory response to market failure for low-skilled workers and can act to reduce age and gender pay inequality, and to reduce household income inequality among in-work households below average earnings.
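The monopsony argument can be stated in one line; this is the textbook first-order condition, consistent with the paper's reasoning rather than quoted from it:

$$w^{*} = \frac{\mathrm{MRP}_L}{1 + 1/\varepsilon}, \qquad \varepsilon = \frac{\partial L}{\partial w}\,\frac{w}{L},$$

so a profit-maximizing employer facing an upward-sloping labour supply pays a wage below marginal revenue product, with the markdown largest where the supply elasticity $\varepsilon$ is low, as for low-skilled workers with few outside options. A wage floor set between $w^{*}$ and $\mathrm{MRP}_L$ can then raise pay without reducing employment, which is the sense in which a living wage acts as a regulatory response to market failure.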
Abstract:
Isolated cytostatic lung perfusion (ILP) is an attractive technique that allows delivery of a high dose of cytostatic agents to the lungs while limiting systemic toxicity. In developing a rat model of ILP, we analysed the effect of the route of tumour cell injection on the source of tumour vessels. Pulmonary sarcomas were established by injecting a sarcoma cell suspension either intravenously (i.v.) or directly into the lung parenchyma. Ink perfusion through either the pulmonary artery (PA) or the bronchial arteries (BA) was performed and the characteristics of the tumour deposits defined. Both i.v. and direct injection induced pulmonary sarcoma nodules with similar histological features. Intraparenchymal injection of tumour cells resulted in more reliable and reproducible tumour growth and was associated with longer survival of the animals. i.v.-injected tumours developed a PA-derived vascular tree, whereas directly injected tumours developed a BA-derived vasculature.
Abstract:
The project presented here aims to define and implement a simulation model for the coordination and assignment of emergency services at traffic accidents. The model was defined using Coloured Petri Nets and implemented with the Rockwell Arena 7.0 software. The first simulation is a theoretical, queue-based model, while the second is a more complete and realistic model thanks to a CORBA connection to a database holding geographic information on the fleets and routes. As a result of the study, and with the help of GoogleEarth, graphical simulations can be run to display the generated accidents, the service fleets and the movement of vehicles from their bases to the accidents.
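As a rough illustration of the first, queue-based model, the sketch below simulates accident arrivals served by a small emergency fleet with a pure-Python event loop. All names, parameter values and the FIFO dispatch discipline are illustrative assumptions; the actual model was built with Coloured Petri Nets and Arena, not ported here:

```python
import heapq
import random

def simulate_dispatch(n_vehicles=3, arrival_rate=1 / 10.0, service_rate=1 / 25.0,
                      horizon=8 * 60, seed=1):
    """Minimal discrete-event sketch of a queue-based dispatch model.

    Accidents arrive as a Poisson process (mean inter-arrival 10 min); a
    fleet of n_vehicles serves them FIFO, each intervention lasting an
    exponential time (mean 25 min). Returns the mean wait, in minutes,
    between an accident occurring and a vehicle being dispatched.
    """
    rng = random.Random(seed)
    events = [(rng.expovariate(arrival_rate), "accident")]  # (time, kind) heap
    free, queue, waits = n_vehicles, [], []
    while events:
        t, kind = heapq.heappop(events)
        if t > horizon:
            break
        if kind == "accident":
            queue.append(t)  # record when the accident occurred
            heapq.heappush(events, (t + rng.expovariate(arrival_rate), "accident"))
        else:
            free += 1        # a vehicle has finished and returned to base
        while free and queue:          # dispatch to the oldest waiting accident
            waits.append(t - queue.pop(0))
            free -= 1
            heapq.heappush(events, (t + rng.expovariate(service_rate), "done"))
    return sum(waits) / len(waits) if waits else 0.0

print(f"mean wait: {simulate_dispatch():.1f} min")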