911 results for Models of Internationalization
Abstract:
This paper does two things. First, it presents alternative approaches to the standard methods of estimating productive efficiency using a production function. It favours a parametric approach (viz. the stochastic production frontier approach) over a nonparametric approach (e.g. data envelopment analysis), and, further, one that provides a statistical explanation of efficiency as well as an estimate of its magnitude. Second, it illustrates the favoured approach (i.e. the ‘single stage procedure’) with estimates of two models of explained inefficiency, using data from the Thai manufacturing sector after the crisis of 1997. Technical efficiency is modelled as being dependent on capital investment in three major areas (viz. land, machinery and office appliances), where land is intended to proxy the effects of unproductive, speculative capital investment, and both machinery and office appliances are intended to proxy the effects of productive, non-speculative capital investment. The estimates from these models cast new light on the five-year post-1997 crisis period in Thailand, suggesting a structural shift from relatively labour-intensive to relatively capital-intensive production in manufactures from 1998 to 2002.
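As background, the ‘single stage procedure’ is commonly associated with specifications in the spirit of Battese and Coelli (1995); a generic summary (notation ours, not taken from the paper) is:

```latex
\ln y_i = x_i'\beta + v_i - u_i,\qquad
v_i \sim N(0,\sigma_v^2),\qquad
u_i \sim N^{+}\!\big(z_i'\delta,\ \sigma_u^2\big),\qquad
\mathrm{TE}_i = \exp(-u_i),
```

where $z_i$ collects the efficiency covariates (in the paper's application, investment in land, machinery and office appliances). All parameters are estimated jointly by maximum likelihood, so the magnitude of inefficiency and its statistical explanation come from a single estimation rather than a two-step regression.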
Abstract:
BACKGROUND: The zebrafish is a clinically relevant model of heart regeneration. Unlike mammals, it has a remarkable capacity for heart repair after injury, and it promises novel translational applications. Amputation and cryoinjury models are key research tools for understanding injury response and regeneration in vivo. An understanding of the transcriptional responses following injury is needed to identify key players in heart tissue repair, as well as potential targets for boosting this property in humans. RESULTS: We investigated amputation and cryoinjury in vivo models of heart damage in the zebrafish through unbiased, integrative analyses of independent molecular datasets. To detect genes with potential biological roles, we derived computational prediction models with microarray data from heart amputation experiments. We focused on a top-ranked set of genes highly activated in the early post-injury stage, whose activity was further verified in independent microarray datasets. Next, we performed independent validations of expression responses with qPCR in a cryoinjury model. Across in vivo models, the top candidates showed highly concordant responses at 1 and 3 days post-injury, which highlights the predictive power of our analysis strategies and the possible biological relevance of these genes. Top candidates are significantly involved in cell fate specification and differentiation, and include heart failure markers such as periostin, as well as potential new targets for heart regeneration. For example, ptgis and ca2 were overexpressed, while usp2a, a regulator of the p53 pathway, was down-regulated in our in vivo models. Interestingly, high activity of ptgis and ca2 has previously been observed in failing hearts from rats and humans. CONCLUSIONS: We identified genes with potential critical roles in the response to cardiac damage in the zebrafish. Their transcriptional activities are reproducible in different in vivo models of cardiac injury.
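The ranking-and-validation idea can be illustrated with a minimal sketch (gene names taken from the abstract, fold-change values invented for illustration; this is not the authors' pipeline): rank genes by early post-injury activation in one dataset, then check the direction of change in an independent one.

```python
import pandas as pd

# Hypothetical log2 fold changes (injured vs. control) at an early
# post-injury time point, from two independent expression datasets.
discovery = pd.Series({"periostin": 3.1, "ptgis": 2.4, "ca2": 2.2, "usp2a": -1.8})
validation = pd.Series({"periostin": 2.7, "ptgis": 2.1, "ca2": 1.9, "usp2a": -1.5})

# Top-ranked candidates: strongly responding genes in the discovery set.
top = discovery[discovery.abs() > 1.0].sort_values(ascending=False, key=abs)

# Concordance check: same direction of change in the independent dataset.
concordant = [g for g in top.index if discovery[g] * validation[g] > 0]
print("top candidates:", list(top.index))
print("concordant across models:", concordant)
```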
Abstract:
We investigated the dimensionality of the French version of the Rosenberg Self-Esteem Scale (RSES; Rosenberg, 1965) using confirmatory factor analysis, testing models of 1 or 2 factors. Results suggest the RSES is a 1-dimensional scale with 3 highly correlated items. Comparison with the Revised NEO Personality Inventory (NEO-PI-R; Costa, McCrae, & Rolland, 1998) demonstrated that Neuroticism correlated strongly, and Extraversion and Conscientiousness moderately, with the RSES. Depression accounted for 47% of the variance of the RSES. Other NEO-PI-R facets were also moderately related to self-esteem.
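A comparison of 1- versus 2-factor CFA models can be sketched as follows, assuming the semopy package and hypothetical item columns i1..i10 (the assignment of items to the two factors is illustrative, not the paper's):

```python
import pandas as pd
import semopy

# Hypothetical data frame with the ten RSES items scored as i1..i10.
data = pd.read_csv("rses_items.csv")  # placeholder file name

# One-factor model: all items load on a single self-esteem factor.
one_factor = "selfesteem =~ i1 + i2 + i3 + i4 + i5 + i6 + i7 + i8 + i9 + i10"

# Two-factor model: hypothetical split into positively vs. negatively
# worded items, a common alternative in the RSES literature.
two_factor = """
positive =~ i1 + i2 + i4 + i6 + i7
negative =~ i3 + i5 + i8 + i9 + i10
"""

for desc in (one_factor, two_factor):
    model = semopy.Model(desc)
    model.fit(data)
    print(semopy.calc_stats(model).T)  # fit indices (CFI, RMSEA, AIC, ...)
```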
Abstract:
This paper examines both the in-sample and out-of-sample performance of three monetary fundamental models of exchange rates and compares their out-of-sample performance to that of a simple Random Walk model. Using a data set consisting of five currencies at monthly frequency over the period January 1980 to December 2009 and a battery of newly developed performance measures, the paper shows that monetary models outperform a simple Random Walk model both in-sample and in out-of-sample forecasting.
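The basic out-of-sample horse race against the random walk can be sketched in a few lines (simulated stand-in data, not the paper's five-currency data set; the regression form is a generic fundamentals specification):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical monthly series: monetary fundamentals f and log exchange
# rate s that mean-reverts toward f (illustrative data only).
n = 360
f = np.cumsum(rng.normal(0.0, 0.01, n))
s = f + rng.normal(0.0, 0.02, n)

window = 120                 # rolling estimation window (10 years)
err_mon, err_rw = [], []
for t in range(window, n - 1):
    z = s[t - window:t] - f[t - window:t]            # deviation from fundamentals
    ds = s[t - window + 1:t + 1] - s[t - window:t]   # next-month changes
    b1, b0 = np.polyfit(z, ds, 1)                    # OLS: ds = b0 + b1 * z
    err_mon.append(s[t] + b0 + b1 * (s[t] - f[t]) - s[t + 1])
    err_rw.append(s[t] - s[t + 1])                   # random walk: no change

for name, e in (("monetary model", err_mon), ("random walk", err_rw)):
    print(name, "out-of-sample RMSE:", np.sqrt(np.mean(np.square(e))))
```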
Abstract:
We survey the main theoretical aspects of models for Mobile Ad Hoc Networks (MANETs). We present theoretical characterizations of mobile network structural properties, different dynamic graph models of MANETs, and finally we give detailed summaries of a few selected articles. In particular, we focus on articles dealing with connectivity of mobile networks, and on articles which show that mobility can be used to propagate information between nodes of the network while at the same time maintaining small transmission distances, and thus saving energy.
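The "mobility propagates information over short transmission distances" idea admits a toy simulation (all parameters illustrative; this is a generic random-walk mobility model, not a construction from the surveyed articles):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy MANET: n nodes random-walking on an L x L torus; a message spreads
# only between nodes within transmission radius r. Mobility, rather than
# long-range (energy-costly) transmission, eventually informs everyone.
n, L, r, steps = 50, 100.0, 5.0, 2000
pos = rng.uniform(0, L, size=(n, 2))
informed = np.zeros(n, dtype=bool)
informed[0] = True  # one source node

for t in range(steps):
    pos = (pos + rng.normal(0, 1.0, size=(n, 2))) % L        # mobility step
    diff = (pos[:, None, :] - pos[None, :, :] + L / 2) % L - L / 2
    d = np.linalg.norm(diff, axis=2)                          # torus distances
    informed |= (d[informed] < r).any(axis=0)                 # short-range exchange
    if informed.all():
        print("all nodes informed after", t + 1, "steps")
        break
```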
Abstract:
Using numerical simulations we investigate shapes of random equilateral open and closed chains, one of the simplest models of freely fluctuating polymers in a solution. We are interested in the 3D density distribution of the modeled polymers where the polymers have been aligned with respect to their three principal axes of inertia. This type of approach was pioneered by Theodorou and Suter in 1985. While individual configurations of the modeled polymers are almost always nonsymmetric, the approach of Theodorou and Suter results in cumulative shapes that are highly symmetric. By taking advantage of asymmetries within the individual configurations, we modify the procedure of aligning independent configurations in a way that shows their asymmetry. This approach reveals, for example, that the 3D density distribution for linear polymers has a bean shape predicted theoretically by Kuhn. The symmetry-breaking approach reveals complementary information to the traditional, symmetrical, 3D density distributions originally introduced by Theodorou and Suter.
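The symmetric, Theodorou–Suter-style part of the procedure (generate a random equilateral chain, align it to its principal axes of inertia, accumulate a 3D density) can be sketched as follows; the paper's symmetry-breaking modification additionally fixes orientations using asymmetries of each configuration, which this sketch omits:

```python
import numpy as np

rng = np.random.default_rng(2)

def random_equilateral_chain(n):
    """Random open chain of n unit-length segments (n+1 vertices)."""
    steps = rng.normal(size=(n, 3))
    steps /= np.linalg.norm(steps, axis=1, keepdims=True)  # unit steps
    return np.vstack([np.zeros(3), np.cumsum(steps, axis=0)])

def align_to_principal_axes(pts):
    """Center a configuration and rotate it onto its principal axes."""
    pts = pts - pts.mean(axis=0)
    gyration = pts.T @ pts / len(pts)    # gyration tensor
    w, v = np.linalg.eigh(gyration)      # eigenvalues in ascending order
    return pts @ v[:, ::-1]              # largest-variance axis first

# Cumulative 3D density over many aligned configurations.
aligned = np.vstack([align_to_principal_axes(random_equilateral_chain(50))
                     for _ in range(200)])
density, edges = np.histogramdd(aligned, bins=20)
print(density.shape)  # (20, 20, 20) voxel grid
```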
Abstract:
This thesis investigates the extremal properties of certain risk models of interest in various applications from insurance, finance and statistics. It develops along two principal lines. In the first part, we focus on two univariate risk models, namely deflated risk and reinsurance risk models, and investigate their tail expansions under certain tail conditions on the common risks. The main results are illustrated by typical examples and numerical simulations. Finally, the findings are formulated into applications in insurance, for instance the approximations of Value-at-Risk and conditional tail expectations. The second part of this thesis is devoted to three bivariate models. The first model is concerned with bivariate censoring of extreme events. For this model, we first propose a class of estimators for both the tail dependence coefficient and the tail probability. These estimators are flexible due to a tuning parameter, and their asymptotic distributions are obtained under some second-order bivariate slowly varying conditions on the model. We then give some examples and present a small Monte Carlo simulation study, followed by an application to a real data set from insurance.
The objective of our second bivariate risk model is the investigation of the tail dependence coefficient of bivariate skew slash distributions. Such skew slash distributions are extensively useful in statistical applications; they are generated mainly by normal mean-variance mixtures and scaled skew-normal mixtures, which distinguish the tail dependence structure as shown by our principal results. The third bivariate risk model is concerned with the approximation of the component-wise maxima of skew elliptical triangular arrays. The theoretical results are based on certain tail assumptions on the underlying random radius.
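For orientation, a generic empirical estimator of the upper tail dependence coefficient looks as follows; this is the textbook nonparametric version, not the thesis's censoring-adapted class, and the role of k loosely mirrors the tuning parameter mentioned above:

```python
import numpy as np

def upper_tail_dependence(x, y, k):
    """Fraction of joint exceedances above the k-th largest order statistics,
    an empirical estimate of P(Y extreme | X extreme)."""
    x, y = np.asarray(x), np.asarray(y)
    n = len(x)
    x_thr = np.sort(x)[n - k]
    y_thr = np.sort(y)[n - k]
    return np.mean((x > x_thr) & (y > y_thr)) * n / k

# Illustrative use on simulated data with a heavy-tailed common shock
# (purely hypothetical numbers).
rng = np.random.default_rng(3)
shock = rng.standard_t(df=3, size=10_000)
x = shock + rng.normal(size=10_000)
y = shock + rng.normal(size=10_000)
print(upper_tail_dependence(x, y, k=200))
```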
Abstract:
Gene expression data from microarrays are being applied to predict preclinical and clinical endpoints, but the reliability of these predictions has not been established. In the MAQC-II project, 36 independent teams analyzed six microarray data sets to generate predictive models for classifying a sample with respect to one of 13 endpoints indicative of lung or liver toxicity in rodents, or of breast cancer, multiple myeloma or neuroblastoma in humans. In total, >30,000 models were built using many combinations of analytical methods. The teams generated predictive models without knowing the biological meaning of some of the endpoints and, to mimic clinical reality, tested the models on data that had not been used for training. We found that model performance depended largely on the endpoint and team proficiency, and that different approaches generated models of similar performance. The conclusions and recommendations from MAQC-II should be useful for regulatory agencies, study committees and independent investigators who evaluate methods for global gene expression analysis.
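The central methodological discipline, evaluating on data never used for training, can be illustrated with a minimal scikit-learn sketch (synthetic stand-in data; not the MAQC-II pipeline or any team's model):

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for a microarray endpoint: X holds expression values,
# y a binary endpoint. The "external" split mimics MAQC-II's blinded test data.
X, y = make_classification(n_samples=300, n_features=500, n_informative=20,
                           random_state=0)
X_train, y_train = X[:200], y[:200]   # development set
X_ext, y_ext = X[200:], y[200:]       # held out, never touched during training

model = LogisticRegression(penalty="l2", C=0.1, max_iter=1000)

# Internal cross-validation guides model selection...
print("CV accuracy:", cross_val_score(model, X_train, y_train, cv=5).mean())

# ...but the honest performance figure comes from the untouched external set.
model.fit(X_train, y_train)
print("external accuracy:", model.score(X_ext, y_ext))
```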
Abstract:
This paper investigates the role of learning by private agents and the central bank (two-sided learning) in a New Keynesian framework in which both sides of the economy have asymmetric and imperfect knowledge about the true data generating process. We assume that all agents employ the data that they observe (which may be distinct for different sets of agents) to form beliefs about unknown aspects of the true model of the economy, use their beliefs to decide on actions, and revise these beliefs through a statistical learning algorithm as new information becomes available. We study the short-run dynamics of our model and derive its policy recommendations, particularly with respect to central bank communications. We demonstrate that two-sided learning can generate substantial increases in volatility and persistence, and alter the behavior of the variables in the model in a significant way. Our simulations do not converge to a symmetric rational expectations equilibrium and we highlight one source that invalidates the convergence results of Marcet and Sargent (1989). Finally, we identify a novel aspect of central bank communication in models of learning: communication can be harmful if the central bank's model is substantially mis-specified.
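The "statistical learning algorithm" in this literature is typically recursive least squares; a generic constant-gain update (standard adaptive-learning machinery, not the paper's exact two-sided specification) looks like this:

```python
import numpy as np

def constant_gain_rls(phi, R, x, y, gain=0.02):
    """One constant-gain recursive least squares update.

    phi: current coefficient beliefs; R: estimate of the regressor
    moment matrix; x: current regressors; y: realized outcome."""
    R = R + gain * (np.outer(x, x) - R)                    # update moments
    phi = phi + gain * np.linalg.solve(R, x * (y - x @ phi))  # forecast-error correction
    return phi, R
```

In a two-sided setting, both the private sector and the central bank run updates of this type on their own (possibly different, possibly misspecified) regressor sets, which is one channel through which convergence to rational expectations can fail.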
Abstract:
Here I develop a model of a radiative-convective atmosphere with both radiative and convective schemes highly simplified. The atmospheric absorption of radiation at selective wavelengths makes use of constant mass absorption coefficients in finite width spectral bands. The convective regime is introduced by using a prescribed lapse rate in the troposphere. The main novelty of the radiative-convective model developed here is that it is solved without using any angular approximation for the radiation field. The solution obtained in the purely radiative mode (i.e. with convection ignored) leads to multiple equilibria of stable states, being very similar to some results recently found in simple models of planetary atmospheres. However, the introduction of convective processes removes the multiple equilibria of stable states. This shows the importance of taking convective processes into account even for qualitative analyses of planetary atmospheres.
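The phrase "without any angular approximation" can be made concrete: in a plane-parallel gray band the model integrates the Schwarzschild equation over all directions μ = cos θ, rather than assuming, e.g., a two-stream form (generic textbook notation, not the paper's):

```latex
\mu\,\frac{dI(\tau,\mu)}{d\tau} = I(\tau,\mu) - B(\tau),
\qquad
F(\tau) = 2\pi \int_{-1}^{1} I(\tau,\mu)\,\mu\,d\mu,
```

where $B$ is the Planck source in the band. Radiative equilibrium then fixes the temperature profile, and the prescribed tropospheric lapse rate overrides it wherever convection would set in.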
Abstract:
1. Identifying those areas suitable for recolonization by threatened species is essential to support efficient conservation policies. Habitat suitability models (HSM) predict species' potential distributions, but the quality of their predictions should be carefully assessed when the species-environment equilibrium assumption is violated. 2. We studied the Eurasian otter Lutra lutra, whose numbers are recovering in southern Italy. To produce widely applicable results, we chose standard HSM procedures and looked for the models' capacities in predicting the suitability of a recolonization area. We used two fieldwork datasets: presence-only data, used in the Ecological Niche Factor Analyses (ENFA), and presence-absence data, used in a Generalized Linear Model (GLM). In addition to cross-validation, we independently evaluated the models with data from a recolonization event, providing presences on a previously unoccupied river. 3. Three of the models successfully predicted the suitability of the recolonization area, but the GLM built with data before the recolonization disagreed with these predictions, missing the recolonized river's suitability and badly describing the otter's niche. Our results highlighted three points of relevance to modelling practices: (1) absences may prevent the models from correctly identifying areas suitable for a species spread; (2) the selection of variables may lead to randomness in the predictions; and (3) the Area Under Curve (AUC), a commonly used validation index, was not well suited to the evaluation of model quality, whereas the Boyce Index (CBI), based on presence data only, better highlighted the models' fit to the recolonization observations. 4. For species with unstable spatial distributions, presence-only models may work better than presence-absence methods in making reliable predictions of suitable areas for expansion. An iterative modelling process, using new occurrences from each step of the species spread, may also help in progressively reducing errors. 5. Synthesis and applications. Conservation plans depend on reliable models of the species' suitable habitats. In non-equilibrium situations, such as the case for threatened or invasive species, models could be affected negatively by the inclusion of absence data when predicting the areas of potential expansion. Presence-only methods will here provide a better basis for productive conservation management practices.
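Point 3 singles out the Boyce index as a presence-only validation measure; a generic sketch of the continuous version follows (suitability assumed scaled to [0, 1], window scheme illustrative; not the exact procedure used in the paper):

```python
import numpy as np
from scipy.stats import spearmanr

def boyce_index(suit_presence, suit_background, nbins=10, overlap=0.5):
    """Continuous Boyce index: Spearman correlation between the
    predicted-to-expected (P/E) ratio and suitability, over moving windows."""
    suit_presence = np.asarray(suit_presence)
    suit_background = np.asarray(suit_background)
    width = 1.0 / nbins
    starts = np.arange(0.0, 1.0 - width + 1e-9, width * (1 - overlap))
    mids, ratios = [], []
    for lo in starts:
        hi = lo + width
        p = np.mean((suit_presence >= lo) & (suit_presence < hi))   # predicted
        e = np.mean((suit_background >= lo) & (suit_background < hi))  # expected
        if e > 0:
            mids.append((lo + hi) / 2)
            ratios.append(p / e)
    rho, _ = spearmanr(mids, ratios)
    return rho
```

A monotonically increasing P/E curve (index near 1) means the model ranks habitat consistently with where presences actually fall, which is exactly what the recolonization data test here.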
Abstract:
We study whether the neutron skin thickness Δrnp of 208Pb originates from the bulk or from the surface of the nucleon density distributions, according to the mean-field models of nuclear structure, and find that it depends on the stiffness of the nuclear symmetry energy. The bulk contribution to Δrnp arises from an extended sharp radius of neutrons, whereas the surface contribution arises from different widths of the neutron and proton surfaces. Nuclear models where the symmetry energy is stiff, as typical of relativistic models, predict a bulk contribution in Δrnp of 208Pb about twice as large as the surface contribution. In contrast, models with a soft symmetry energy like common nonrelativistic models predict that Δrnp of 208Pb is divided similarly into bulk and surface parts. Indeed, if the symmetry energy is supersoft, the surface contribution becomes dominant. We note that the linear correlation of Δrnp of 208Pb with the density derivative of the nuclear symmetry energy arises from the bulk part of Δrnp. We also note that most models predict a mixed-type (between halo and skin) neutron distribution for 208Pb. Although the halo-type limit is actually found in the models with a supersoft symmetry energy, the skin-type limit is not supported by any mean-field model. Finally, we compute parity-violating electron scattering in the conditions of the 208Pb parity radius experiment (PREX) and obtain a pocket formula for the parity-violating asymmetry in terms of the parameters that characterize the shape of the 208Pb nucleon densities.
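For reference, the neutron skin thickness discussed here is defined from the rms radii of the point-neutron and point-proton distributions, and the bulk/surface decomposition is naturally phrased in terms of two-parameter Fermi profiles (standard definitions; this is not the paper's pocket formula for the parity-violating asymmetry):

```latex
\Delta r_{np} = \langle r^2 \rangle_n^{1/2} - \langle r^2 \rangle_p^{1/2},
\qquad
\rho_q(r) = \frac{\rho_{0,q}}{1 + \exp\!\big[(r - C_q)/a_q\big]},
\quad q = n, p,
```

where a larger neutron half-density radius ($C_n > C_p$) produces a bulk, skin-type contribution to $\Delta r_{np}$, while a larger neutron surface diffuseness ($a_n > a_p$) produces a surface, halo-type contribution.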
Abstract:
Understanding the structure of interphase chromosomes is essential to elucidate regulatory mechanisms of gene expression. During recent years, high-throughput DNA sequencing has expanded the power of chromosome conformation capture (3C) methods, which provide information about the reciprocal spatial proximity of chromosomal loci. Since 2012, it has been known that the entire chromatin in interphase chromosomes is organized into regions with strongly increased frequency of internal contacts. These regions, with an average size of ∼1 Mb, were named topological domains. More recent studies demonstrated the presence of unconstrained supercoiling in interphase chromosomes. Using Brownian dynamics simulations, we show here that by including supercoiling in models of topological domains one can reproduce, and thus provide possible explanations for, several experimentally observed characteristics of interphase chromosomes, such as their complex contact maps.
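The generic machinery behind such simulations is overdamped (Brownian) dynamics of a bead-spring ring polymer; a minimal sketch in reduced units follows (illustrative parameters; supercoiling would require an additional twist-energy term that this sketch omits):

```python
import numpy as np

rng = np.random.default_rng(4)

# Bead-spring ring: harmonic springs between consecutive beads,
# Euler-Maruyama integration of overdamped Langevin dynamics (kT = D = 1).
n, k_spring, dt, steps = 100, 100.0, 1e-4, 10_000
theta = np.linspace(0, 2 * np.pi, n, endpoint=False)
pos = np.column_stack([np.cos(theta), np.sin(theta), np.zeros(n)])

for _ in range(steps):
    bond_next = np.roll(pos, -1, axis=0) - pos            # vectors to next bead
    force = k_spring * (bond_next - np.roll(bond_next, 1, axis=0))
    pos += force * dt + np.sqrt(2 * dt) * rng.normal(size=pos.shape)

bond_next = np.roll(pos, -1, axis=0) - pos
print("mean bond length:", np.linalg.norm(bond_next, axis=1).mean())
```

Contact maps are then obtained by thresholding pairwise bead distances over many saved configurations, which is the quantity compared against 3C-type data.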
Liming in Agricultural Production Models With and Without the Adoption of Crop-Livestock Integration
Abstract:
Perennial forage crops used in crop-livestock integration (CLI) are able to accumulate large amounts of straw on the soil surface in a no-tillage system (NTS). In addition, they can potentially produce large amounts of soluble organic compounds that help improve the efficiency of liming in the subsurface, which favors root growth, thus reducing the risks of yield loss during dry spells and the harmful effects of “overliming”. The aim of this study was to test the effects of liming on two models of agricultural production, with and without crop-livestock integration, over 2 years. An experiment was conducted in a Latossolo Vermelho (Oxisol) with a very clayey texture, located in an agricultural area under the NTS in Bandeirantes, PR, Brazil. Liming was performed to increase base saturation (V) to 65, 75, and 90 %, while one plot per block was maintained without the application of lime (control). A randomized block experimental design was adopted, arranged in split plots with four plots per block and four replications. The soil properties evaluated were pH in CaCl2, soil organic matter (SOM), Ca, Mg, K, Al, and P. The effects of liming were observed to a greater depth and over a longer period through the mobilization of ions in the soil, leading to a reduction in SOM and Al concentration and an increase in pH and in the levels of Ca and Mg. In the first crop year, adoption of CLI led to an increase in the levels of K and Mg and a reduction in the level of SOM; in the second crop year, however, the rate of decline of SOM decreased compared to the first crop year, the level of K increased, and that of P decreased. The extent of the effects of liming, in terms of depth and improvement in the root environment, was observed only partially through the changes in the chemical properties studied.
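As background, lime rates targeting a given base saturation (here 65, 75 and 90 %) are conventionally computed in Brazil with the base-saturation method; the standard formula is given here for context, since the abstract does not reproduce the calculation:

```latex
NC\;(\mathrm{t\ ha^{-1}}) \;=\; \frac{(V_2 - V_1)\,T}{100} \times \frac{100}{PRNT},
```

where $V_2$ is the target base saturation (%), $V_1$ the current base saturation, $T$ the cation exchange capacity at pH 7 (cmol$_c$ dm$^{-3}$) for the 0–20 cm layer, and $PRNT$ the effective neutralizing power of the lime (%).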