935 results for Input-output model
Abstract:
The sensitivity of altitudinal and latitudinal tree-line ecotones to climate change, particularly temperature, has received much attention. To improve our understanding of the factors affecting tree-line position, we used the spatially explicit dynamic forest model TreeMig. Although well suited because of its landscape dynamics functions, TreeMig features a parabolic temperature growth response curve, which has recently been questioned, and its species parameters are not specifically calibrated for cold temperatures. Our main goals were to improve the theoretical basis of the temperature growth response curve in the model and to develop a method for deriving that curve's parameters from tree-ring data. We replaced the parabola with an asymptotic curve, calibrated for the main species at the subalpine (Swiss Alps: Pinus cembra, Larix decidua, Picea abies) and boreal (Fennoscandia: Pinus sylvestris, Betula pubescens, P. abies) tree-lines. After fitting new parameters, the growth curve matched observed tree-ring widths better. For the subalpine species, the minimum degree-day sum allowing growth (kDDMin) was lowered by around 100 degree-days; in the case of Larix, the maximum potential ring-width was increased to 5.19 mm. At the boreal tree-line, the kDDMin for P. sylvestris was lowered by 210 degree-days and its maximum ring-width increased to 2.943 mm; for Betula (new in the model), kDDMin was set to 325 degree-days and the maximum ring-width to 2.51 mm; the values from the only boreal sample site for Picea were similar to the subalpine ones, so the same parameters were used. However, adjusting the growth response alone did not improve the model's output concerning species' distributions and their relative importance at tree-line. Minimum winter temperature (MinWiT, mean of the coldest winter month), which controls seedling establishment in TreeMig, proved more important for determining distribution. Picea, P. sylvestris and Betula did not previously have minimum winter temperature limits, so these values were set to the 95th percentile of each species' coldest MinWiT site (-7, -11 and -13, respectively). In a case study for the Alps, the original and newly calibrated versions of TreeMig were compared with biomass data from the National Forest Inventory (NFI). Both models gave similar, reasonably realistic results. In conclusion, this method of deriving temperature responses from tree-rings works well. However, regeneration and its underlying factors seem more important for controlling species' distributions than previously thought. More research on regeneration ecology, especially at the upper limit of forests, is needed to further improve predictions of tree-line responses to climate change.
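The abstract does not spell out the functional form of the new asymptotic growth response, so the following Python sketch only illustrates the idea: ring width is zero below the minimum degree-day sum kDDMin and saturates towards the maximum potential ring width. The slope parameter k is an arbitrary illustrative value; the Betula pubescens figures quoted above (kDDMin = 325 degree-days, maximum ring width = 2.51 mm) are used for the example.

import numpy as np

def asymptotic_growth(dd, kDDMin, max_ring_width, k=0.002):
    """Illustrative asymptotic temperature growth response: zero below kDDMin,
    saturating towards max_ring_width. The exact form used in TreeMig is not
    given in the abstract; this is a generic saturating curve."""
    dd = np.asarray(dd, dtype=float)
    return np.where(dd > kDDMin,
                    max_ring_width * (1.0 - np.exp(-k * (dd - kDDMin))),
                    0.0)

# Example with the Betula pubescens values quoted above
dd = np.linspace(0, 2000, 9)
print(asymptotic_growth(dd, kDDMin=325, max_ring_width=2.51))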
Abstract:
We develop a mediation model in which firm size is proposed to affect the scale and quality of innovative output through the adoption of different decision styles during the R&D process. The aim of this study is to understand how the internal changes that firms undergo as they evolve from small to larger organizations affect R&D productivity. In so doing, we illuminate the underlying theoretical mechanism affecting two different dimensions of R&D productivity, namely the scale and quality of innovative output, which have not received much attention in the previous literature. Using longitudinal data on Spanish manufacturing firms, we explore the validity of this mediation model. Our results show that as firms grow in size, they increasingly emphasize analytical decision making; consequently, large firms aim for higher-quality innovations while small firms aim for a larger scale of innovative output.
Abstract:
The purpose of this study was to develop a two-compartment metabolic model of brain metabolism to assess oxidative metabolism from [1-(11)C] acetate radiotracer experiments, using an approach previously applied in (13)C magnetic resonance spectroscopy (MRS), and to compare it with the one-tissue compartment model previously used in brain [1-(11)C] acetate studies. Compared with (13)C MRS studies, (11)C radiotracer measurements provide a single uptake curve representing the sum of all labeled metabolites, without chemical differentiation, but with higher temporal resolution. The reliability of the adjusted metabolic fluxes was analyzed with Monte-Carlo simulations using synthetic (11)C uptake curves, based on a typical arterial input function and previously published values of the neuroglial fluxes V_tca^g, V_x, V_nt and V_tca^n measured in dynamic (13)C MRS experiments. Assuming V_x^g = 10 × V_tca^g and V_x^n = V_tca^n, it was possible to assess the composite glial tricarboxylic acid (TCA) cycle flux V_gt^g = V_x^g × V_tca^g / (V_x^g + V_tca^g) and the neurotransmission flux V_nt from (11)C tissue-activity curves obtained within 30 minutes in the rat cortex with a beta-probe after a bolus infusion of [1-(11)C] acetate (n=9), resulting in V_gt^g = 0.136±0.042 and V_nt = 0.170±0.103 μmol/g per minute (mean±s.d. of the group), in good agreement with (13)C MRS measurements.
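As a small worked example, the composite flux above can be computed directly from the stated assumption V_x^g = 10 × V_tca^g; the Python function name below is ours, and the numerical values simply restate the group mean reported above.

# Composite glial TCA cycle flux: V_gt^g = V_x^g * V_tca^g / (V_x^g + V_tca^g)
def composite_glial_flux(v_tca_g, ratio=10.0):
    """Return V_gt^g assuming V_x^g = ratio * V_tca^g (ratio = 10 as in the study)."""
    v_x_g = ratio * v_tca_g
    return v_x_g * v_tca_g / (v_x_g + v_tca_g)

# Under this assumption V_gt^g = (10/11) * V_tca^g, so the reported
# V_gt^g = 0.136 umol/g per minute corresponds to V_tca^g of about 0.150.
print(composite_glial_flux(0.136 * 11 / 10))  # ~0.136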
Abstract:
The project is based on the design and fabrication of a phase-shifter circuit, with one input and one output, that switches between two states: one with a 180º phase shift and the other with no phase shift. The circuit is designed in microstrip for fc = 5 GHz. The aim is to obtain the best possible characteristics for this design, that is, good bandwidth and low insertion loss, together with a good response in the magnitude and phase relations. Several design stages are followed, starting from a simple model and increasing in complexity, adding the new components that will eventually make up the final circuit. After the design, the circuit is fabricated in order to observe its real behaviour. The report collects and organizes the information obtained throughout this process, trying to present it clearly so that the design process can be followed and the results obtained can be interpreted. The final objective is to see how the designed circuit behaves and to define the guidelines to follow for improving it in the future.
Abstract:
We use a dynamic factor model to provide a semi-structural representation for 101 quarterly US macroeconomic series. We find that (i) the US economy is well described by a number of structural shocks between two and six. Focusing on the four-shock specification, we identify, using sign restrictions, two non-policy shocks, demand and supply, and two policy shocks, monetary and fiscal. We obtain the following results. (ii) Both supply and demand shocks are important sources of fluctuations; supply prevails for GDP, while demand prevails for employment and inflation. (iii) Policy matters: both monetary and fiscal policy shocks have sizeable effects on output and prices, with little evidence of crowding out; both monetary and fiscal authorities implement important systematic countercyclical policies reacting to demand shocks. (iv) Negative demand shocks have a large long-run positive effect on productivity, consistent with the Schumpeterian "cleansing" view of recessions.
Abstract:
Gim & Kim (1998) proposed a generalization of Jeong's (1982, 1984) reinterpretation of the Hawkins-Simon condition for macroeconomic stability to off-diagonal matrix elements. This generalization is conceptually relevant, for it offers a complementary view of interindustry linkages beyond final or net output influence. The extension is entirely analogous to the 'total flow' idea introduced by Szyrmer (1992) and the 'output-to-output' multiplier of Miller & Blair (2009). However, the practical implementation of Gim & Kim is faulty, since it confuses the appropriate order of output normalization. We provide a new and elementary solution for the correct formalization using standard interindustry accounting concepts.
Abstract:
BACKGROUND: Half of the patients with end-stage heart failure suffer from persistent atrial fibrillation (AF). The atrial kick (AK) accounts for 10-15% of the ejection fraction. A device restoring AK should significantly improve cardiac output (CO) and possibly delay ventricular assist device (VAD) implantation. This study was designed to assess the mechanical effects of a motorless pump on the right chambers of the heart in an animal model. METHODS: Atripump is a dome-shaped biometal actuator electrically driven by a pacemaker-like control unit. In eight sheep, the device was sutured onto the right atrium (RA). AF was simulated with rapid atrial pacing. RA ejection fraction (EF) was assessed with intracardiac ultrasound (ICUS) under baseline, AF and assisted-AF conditions. In two animals, the pump was left in place for 4 weeks and then explanted, and histological examination was carried out. Mean values of single measurements per animal (±SD) were analysed. RESULTS: The contraction rate of the device was 60 per minute. RA EF was 41% at baseline, 7% in AF and 21% under assisted-AF conditions. CO was 7±0.5 l min(-1) at baseline, 6.2±0.5 l min(-1) in AF and 6.7±0.5 l min(-1) under assisted-AF conditions (p<0.01). Histology of the atrium in the chronic group showed chronic tissue inflammation and no sign of tissue necrosis. CONCLUSIONS: The artificial muscle restores the AK and improves CO. In patients with end-stage cardiac failure and permanent AF, if implanted on both sides, it could improve CO and possibly delay or even avoid complex surgical treatment such as VAD implantation.
Abstract:
The objective of this final-degree project (TFC) is to develop and implement the OpenGL molecular visualization tool HVM. This application, which allows molecules to be visualized and inspected, is very useful in areas such as chemistry, pharmacy and teaching. It accepts molecule definitions through an input file (a simplified variant of the XMOL XYZ format) and builds the corresponding model, which can then be navigated; its elements can be selected and identified, and distances and torsion angles between them can be computed. It also allows an axis to be defined around which the model can be rotated, and an output sequence to be recorded.
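The simplified XMOL XYZ variant used by HVM is not specified here; purely as an illustration, the following Python sketch parses standard XYZ records and computes an interatomic distance and a torsion angle, which are the kinds of measurements the tool exposes (all function names are ours).

import numpy as np

def read_xyz(path):
    """Parse a standard XMOL XYZ file: atom count, comment line, then 'symbol x y z' records."""
    with open(path) as fh:
        n = int(fh.readline())
        fh.readline()                      # comment / title line
        symbols, coords = [], []
        for _ in range(n):
            sym, x, y, z = fh.readline().split()[:4]
            symbols.append(sym)
            coords.append([float(x), float(y), float(z)])
    return symbols, np.array(coords)

def distance(p, q):
    """Euclidean distance between two atomic positions."""
    return float(np.linalg.norm(p - q))

def torsion(p1, p2, p3, p4):
    """Dihedral (torsion) angle in degrees defined by four atomic positions."""
    b1, b2, b3 = p2 - p1, p3 - p2, p4 - p3
    n1, n2 = np.cross(b1, b2), np.cross(b2, b3)
    m1 = np.cross(n1, b2 / np.linalg.norm(b2))
    return float(np.degrees(np.arctan2(np.dot(m1, n2), np.dot(n1, n2))))

# Example usage (hypothetical file): symbols, xyz = read_xyz("molecule.xyz")
# print(distance(xyz[0], xyz[1]), torsion(xyz[0], xyz[1], xyz[2], xyz[3]))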
Abstract:
Uncertainty quantification of petroleum reservoir models is one of the present challenges, which is usually approached with a wide range of geostatistical tools linked with statistical optimisation and/or inference algorithms. Recent advances in machine learning offer a novel approach to modelling the spatial distribution of petrophysical properties in complex reservoirs, as an alternative to geostatistics. The approach is based on semi-supervised learning, which handles both 'labelled' observed data and 'unlabelled' data, which have no measured value but describe prior knowledge and other relevant information in the form of manifolds in the input space where the modelled property is continuous. The proposed semi-supervised Support Vector Regression (SVR) model has demonstrated its capability to represent realistic geological features and to describe the stochastic variability and non-uniqueness of spatial properties. At the same time, it is able to capture and preserve key spatial dependencies such as the connectivity of high-permeability geo-bodies, which is often difficult in contemporary petroleum reservoir studies. Semi-supervised SVR, as a data-driven algorithm, is designed to integrate various kinds of conditioning information and learn dependencies from them. The semi-supervised SVR model is able to balance signal/noise levels and to control the prior belief in the available data. In this work, the stochastic semi-supervised SVR geomodel is integrated into a Bayesian framework to quantify the uncertainty of reservoir production with multiple models fitted to past dynamic observations (production history). Multiple history-matched models are obtained using stochastic sampling and/or MCMC-based inference algorithms, which evaluate the posterior probability distribution. The uncertainty of the model is described by the posterior probability of the model parameters that represent key geological properties: spatial correlation size, continuity strength, and smoothness/variability of the spatial property distribution. The developed approach is illustrated with a fluvial reservoir case. The resulting probabilistic production forecasts are described by uncertainty envelopes. The paper compares the performance of models with different combinations of unknown parameters and discusses sensitivity issues.
Abstract:
We study the minimum mean square error (MMSE) and the multiuser efficiency η of large dynamic multiple-access communication systems in which optimal multiuser detection is performed at the receiver while the number and identities of active users are allowed to change at each transmission time. The system dynamics are ruled by a Markov model describing the evolution of the channel occupancy, and a large-system analysis is performed as the number of observations grows large. Starting from the equivalent scalar channel and the fixed-point equation tying multiuser efficiency and MMSE, we extend it to the case of a dynamic channel and derive lower and upper bounds for the MMSE (and, thus, for η as well) that hold in the limit of large signal-to-noise ratios and increasingly large observation time T.
Abstract:
We develop a model of an industry with many heterogeneous firms that face both financing constraints and irreversibility constraints. The financing constraint implies that firms cannot borrow unless the debt is secured by collateral; the irreversibility constraint implies that they can only sell their fixed capital by selling their business. We use this model to examine the cyclical behavior of aggregate fixed investment, variable capital investment, and output in the presence of persistent idiosyncratic and aggregate shocks. Our model yields three main results. First, the effect of the irreversibility constraint on fixed capital investment is reinforced by the financing constraint. Second, the effect of the financing constraint on variable capital investment is reinforced by the irreversibility constraint. Finally, the interaction between the two constraints is key to explaining why input inventories and material deliveries of US manufacturing firms are so volatile and procyclical, and why they are highly asymmetrical over the business cycle.
Abstract:
In this paper we study the relevance of multiple kernel learning (MKL) for the automatic selection of time series inputs. Recently, MKL has gained great attention in the machine learning community due to its flexibility in modelling complex patterns and performing feature selection. In general, MKL constructs the kernel as a weighted linear combination of basis kernels, exploiting different sources of information. An efficient algorithm that wraps a Support Vector Regression model to optimize the MKL weights, named SimpleMKL, is used for the analysis. In this sense, MKL performs feature selection by discarding inputs/kernels with low or null weights. The proposed approach is tested with simulated linear and nonlinear time series (AutoRegressive, Henon and Lorenz series).
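To make the "weighted linear combination of basis kernels" concrete, here is a minimal Python sketch using scikit-learn. The kernel weights are fixed by hand for illustration, whereas SimpleMKL would learn them jointly with the SVR; the toy series, lag choice and weights below are ours.

import numpy as np
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Toy noisy series; rows of X are lagged input vectors, y is the next value
series = np.sin(np.linspace(0, 20, 220)) + 0.1 * rng.standard_normal(220)
lags = 5
X = np.array([series[i:i + lags] for i in range(len(series) - lags)])
y = series[lags:]
X_train, X_test, y_train, y_test = X[:180], X[180:], y[:180], y[180:]

# Basis kernels from different "sources of information" (here: different RBF widths)
gammas = [0.1, 1.0, 10.0]
weights = [0.5, 0.3, 0.2]            # fixed for illustration; SimpleMKL optimizes these

def combined_kernel(A, B):
    return sum(w * rbf_kernel(A, B, gamma=g) for w, g in zip(weights, gammas))

model = SVR(kernel="precomputed", C=10.0)
model.fit(combined_kernel(X_train, X_train), y_train)
pred = model.predict(combined_kernel(X_test, X_train))
print("test MSE:", np.mean((pred - y_test) ** 2))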
Abstract:
This paper shows that an open-economy Solow model provides a good description of international investment positions in industrialized countries. More than half of the variation in net foreign assets in the 1990s can be attributed to cross-country differences in the savings rate, population and productivity growth. Furthermore, these factors seem to be an important channel through which output and wealth affect international investment positions. We interpret this finding as evidence that decreasing returns are an important source of international capital movements. The savings rate (and population growth) influence the composition of country portfolios through their downward (upward) pressure on the marginal productivity of capital.
Abstract:
We reformulate the Smets-Wouters (2007) framework by embedding the theory of unemployment proposed in Galí (2011a,b). We estimate the resulting model using postwar U.S. data, while treating the unemployment rate as an additional observable variable. Our approach overcomes the lack of identification of wage markup and labor supply shocks highlighted by Chari, Kehoe and McGrattan (2008) in their criticism of New Keynesian models, and allows us to estimate a "correct" measure of the output gap. In addition, the estimated model can be used to analyze the sources of unemployment fluctuations.
Abstract:
Most central banks perceive a trade-off between stabilizing inflation and stabilizing the gap between output and desired output. However, the standard new Keynesian framework implies no such trade-off. In that framework, stabilizing inflation is equivalent to stabilizing the welfare-relevant output gap. In this paper, we argue that this property of the new Keynesian framework, which we call the divine coincidence, is due to a special feature of the model: the absence of nontrivial real imperfections. We focus on one such real imperfection, namely real wage rigidities. When the baseline new Keynesian model is extended to allow for real wage rigidities, the divine coincidence disappears, and central banks indeed face a trade-off between stabilizing inflation and stabilizing the welfare-relevant output gap. We show that not only does the extended model have more realistic normative implications, but it also has appealing positive properties. In particular, it provides a natural interpretation for the dynamic inflation-unemployment relation found in the data.