871 results for Dynamic Model Averaging
Abstract:
We develop methods for Bayesian model averaging (BMA) or selection (BMS) in Panel Vector Autoregressions (PVARs). Our approach allows us to select between or average over all possible combinations of restricted PVARs where the restrictions involve interdependencies between and heterogeneities across cross-sectional units. The resulting BMA framework can find a parsimonious PVAR specification, thus dealing with overparameterization concerns. We use these methods in an application involving the euro area sovereign debt crisis and show that our methods perform better than alternatives. Our findings contradict a simple view of the sovereign debt crisis which divides the euro zone into groups of core and peripheral countries and worries about financial contagion within the latter group.
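The core of any BMA scheme is the computation of posterior model probabilities and the weighted combination of model-specific estimates. As a minimal illustration of that machinery (a generic sketch using a BIC approximation to the marginal likelihood, not the authors' PVAR-specific algorithm; all names and the model space are illustrative), in Python:

import numpy as np

def bma_bic(y, X, candidate_sets):
    # Approximate BMA over subsets of regressors using BIC-based model weights.
    # y: (n,) response; X: (n, k) regressors; candidate_sets: list of column-index tuples.
    n, k = X.shape
    bics, betas = [], []
    for cols in candidate_sets:
        cols = list(cols)
        Xm = X[:, cols]
        beta, *_ = np.linalg.lstsq(Xm, y, rcond=None)
        resid = y - Xm @ beta
        sigma2 = resid @ resid / n
        loglik = -0.5 * n * (np.log(2.0 * np.pi * sigma2) + 1.0)  # Gaussian log-likelihood at the MLE
        bics.append(-2.0 * loglik + len(cols) * np.log(n))
        b_full = np.zeros(k)
        b_full[cols] = beta
        betas.append(b_full)
    bics = np.array(bics)
    w = np.exp(-0.5 * (bics - bics.min()))  # exp(-BIC/2) is proportional to the marginal likelihood under equal priors
    w /= w.sum()
    return w, np.average(np.array(betas), axis=0, weights=w)  # model weights and model-averaged coefficients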
Abstract:
Bayesian model averaging (BMA) methods are regularly used to deal with model uncertainty in regression models. This paper shows how to introduce Bayesian model averaging methods in quantile regressions, and allow for different predictors to affect different quantiles of the dependent variable. I show that quantile regression BMA methods can help reduce uncertainty regarding outcomes of future inflation by providing superior predictive densities compared to mean regression models with and without BMA.
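A concrete, if simplified, way to see how model averaging interacts with quantile regression is to fit each candidate predictor set by minimising the check (pinball) loss and then combine the fitted quantiles with weights derived from fit. The sketch below does exactly that; it is an informal stand-in for the paper's Bayesian treatment, and the weighting rule and all names are assumptions made here for illustration.

import numpy as np
from scipy.optimize import minimize

def pinball_loss(beta, y, X, tau):
    # Check (pinball) loss for quantile level tau.
    u = y - X @ beta
    return np.sum(np.where(u >= 0.0, tau * u, (tau - 1.0) * u))

def fit_quantile(y, X, tau):
    # Linear quantile regression fitted by direct minimisation of the check loss.
    res = minimize(pinball_loss, np.zeros(X.shape[1]), args=(y, X, tau), method="Nelder-Mead")
    return res.x

def average_quantile_models(y, X, tau, candidate_sets):
    # Combine quantile fits from different predictor subsets; the exponential weighting of
    # in-sample fit below is a crude stand-in for proper posterior model probabilities.
    preds, scores = [], []
    for cols in candidate_sets:
        Xm = X[:, list(cols)]
        beta = fit_quantile(y, Xm, tau)
        preds.append(Xm @ beta)
        scores.append(pinball_loss(beta, y, Xm, tau) / len(y))
    scores = np.array(scores)
    w = np.exp(-(scores - scores.min()))
    w /= w.sum()
    return w, np.average(np.array(preds), axis=0, weights=w)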
Abstract:
Hepatitis C virus (HCV) NS3-4A is a membrane-associated multifunctional protein harboring serine protease and RNA helicase activities. It is an essential component of the HCV replication complex and a prime target for antiviral intervention. Here, we show that membrane association and structural organization of HCV NS3-4A are ensured in a cooperative manner by two membrane-binding determinants. We demonstrate that the N-terminal 21 amino acids of NS4A form a transmembrane alpha-helix that may be involved in intramembrane protein-protein interactions important for the assembly of a functional replication complex. In addition, we demonstrate that amphipathic helix alpha(0), formed by NS3 residues 12-23, serves as a second essential determinant for membrane association of NS3-4A, allowing proper positioning of the serine protease active site on the membrane. These results allowed us to propose a dynamic model for the membrane association, processing, and structural organization of NS3-4A on the membrane. This model has implications for the functional architecture of the HCV replication complex, proteolytic targeting of host factors, and drug design.
Abstract:
Theories on social capital and on social entrepreneurship have mainly highlighted the capacity of social capital to generate enterprises and to foster good relations between third sector organizations and the public sector. This paper considers social capital in a specific third sector enterprise; here, multi-stakeholder social cooperatives are seen, at the same time, as results, creators and incubators of social capital. In the particular enterprises that identify themselves as community social enterprises (SCEs), social capital, both as organizational and relational capital, is fundamental: SCEs arise from, but also produce and disseminate, social capital. This paper aims to improve the building of relational social capital and the refining of helpful relations drawn from other arenas, where they were created and from where they are sometimes transferred to other settings in which their role is carried further (often within non-profit, horizontally and vertically arranged groups that share resources and relations). To represent this perspective, we use a qualitative system dynamics approach in which social capital is measured using proxies. Cooperation of volunteers, customers, community leaders and local third sector organizations is fundamental to establishing trust relations between local public authorities and cooperatives. These relations help the latter to maintain long-term contracts with local authorities as providers of social services and enable them to add innovation to their services by developing experience and management models and by maintaining an interchange with civil servants on these matters. The long-term relations and the organizational relations linking SCEs and public organizations help to create and to renew social capital. Thus, multi-stakeholder cooperatives that originated from social capital developed in third sector organizations produce new social capital within the cooperatives themselves and between different cooperatives (the entrepreneurial components of the third sector) and the public sector. In their entrepreneurial life, cooperatives have to counteract the "working drift," as a result of which only workers remain as members of the cooperative, while other stakeholders leave the organization. Those who are not workers in the cooperative are (stake)holders with "weak ties," who are nevertheless fundamental in making a workers' cooperative an authentic multi-stakeholder social cooperative. To maintain multi-stakeholder governance and relations with the third sector and civil society, social cooperatives have to reinforce participation and dialogue with civil society through ongoing efforts to include people who put forward social proposals. We represent these processes in a system dynamics model applied to local cooperatives, measuring the social capital created by the social cooperative through proxies such as the number of volunteers and the strength of cooperation with public institutions. Using a reverse-engineering approach, we can identify the determinants of the creation of social capital and thereby support governance that creates social capital.
Abstract:
Nessie is an Autonomous Underwater Vehicle (AUV) created by a team of students at Heriot-Watt University to compete in the Student Autonomous Underwater Competition, Europe (SAUC-E) in August 2006. The main objective of the project is to derive the dynamic equation of the robot, i.e. its dynamic model. With this model, the behaviour of the robot is easier to understand, and movement tests can be run on a computer without the robot itself, which saves time, batteries and money and spares the robot the risk of water leaking inside. The objective of the second part of this project is to design a control system for Nessie using the model.
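For a single degree of freedom, the kind of dynamic equation referred to above is often written as m·dv/dt = T − d_lin·v − d_quad·v·|v| (thrust against linear and quadratic drag). The sketch below integrates that equation so a thrust-step response can be tested on a computer; the mass and drag coefficients are placeholders, not Nessie's identified parameters.

import numpy as np

def simulate_surge(thrust, dt=0.01, t_end=20.0, mass=50.0, d_lin=10.0, d_quad=25.0):
    # Forward-Euler integration of m*dv/dt = thrust - d_lin*v - d_quad*v*|v|.
    # All numerical parameters are illustrative placeholders (SI units).
    n = int(t_end / dt)
    t = np.linspace(0.0, t_end, n)
    v = np.zeros(n)
    for i in range(1, n):
        drag = d_lin * v[i - 1] + d_quad * v[i - 1] * abs(v[i - 1])
        v[i] = v[i - 1] + dt * (thrust - drag) / mass
    return t, v

t, v = simulate_surge(thrust=30.0)  # response to a constant 30 N thrust step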
Abstract:
This paper presents and estimates a dynamic choice model in the attribute space considering rational consumers. In light of the evidence of several state-dependence patterns, the standard attribute-based model is extended by considering a general utility function where pure inertia and pure variety-seeking behaviors can be explained in the model as particular linear cases. The dynamics of the model are fully characterized by standard dynamic programming techniques. The model presents a stationary consumption pattern that can be inertial, where the consumer only buys one product, or a variety-seeking one, where the consumer shifts among varied products. We run some simulations to analyze the consumption paths out of the steady state. Under the hybrid utility assumption, the consumer behaves inertially among the unfamiliar brands for several periods, eventually switching to a variety-seeking behavior when the stationary levels are approached. An empirical analysis is run using scanner databases for three different product categories: fabric softener, saltine cracker, and catsup. Non-linear specifications provide the best fit of the data, as hybrid functional forms are found in all the product categories for most attributes and segments. These results reveal the statistical superiority of the non-linear structure and confirm the gradual trend to seek variety as the level of familiarity with the purchased items increases.
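"Standard dynamic programming techniques" here means solving a Bellman equation of the form V(s) = max_a [u(s, a) + beta·V(s')]. The sketch below is a textbook value-iteration solver applied to a toy state (familiarity with one brand) and a hump-shaped utility under which inertia and variety seeking can both appear; it is not the paper's estimated specification, and all numbers are illustrative.

import numpy as np

def value_iteration(utility, transition, beta=0.95, tol=1e-8, max_iter=10000):
    # Solve V(s) = max_a [ utility[s, a] + beta * V(transition[s, a]) ] by iteration.
    # utility: (S, A) per-period utilities; transition: (S, A) deterministic next states.
    V = np.zeros(utility.shape[0])
    for _ in range(max_iter):
        Q = utility + beta * V[transition]
        V_new = Q.max(axis=1)
        if np.max(np.abs(V_new - V)) < tol:
            V = V_new
            break
        V = V_new
    return V, Q.argmax(axis=1)

# Toy state: familiarity with brand A (0..4); action 0 buys A, action 1 buys the outside brand.
fam = np.arange(5)
utility = np.column_stack([1.0 + 0.6 * fam - 0.15 * fam**2,  # hump-shaped utility of buying A
                           np.full(5, 1.2)])                  # flat utility of the outside brand
transition = np.column_stack([np.minimum(fam + 1, 4),         # buying A raises familiarity
                              np.maximum(fam - 1, 0)])        # not buying A lets it decay
V, policy = value_iteration(utility, transition)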
Abstract:
Quality of life has been extensively discussed in acute and chronic illnesses. However, a dynamic model grounded in the experience of patients over the course of transplantation has not, to our knowledge, been developed. In a qualitative longitudinal study, patients awaiting solid organ transplantation participated in semi-structured interviews exploring topics pre-selected from a review of the previous research literature. A creative interview style was favoured, open to themes patients wished to discuss at the different steps of the transplantation process. A qualitative thematic and reflexive analysis was performed, and a model of the dimensions constitutive of quality of life from the perspective of the patients was elaborated. Quality of life is not a stable construct over a long-lasting illness course, but evolves with illness constraints, treatments and outcomes. The dimensions constitutive of quality of life are defined, each of them containing different sub-categories depending on the organ-related illness co-morbidities and the stage of the illness course.
Abstract:
The main objective of this work is to analyze the importance of the gas-solid interphase transfer of turbulent kinetic energy for the accuracy of predictions of the fluid dynamics of Circulating Fluidized Bed (CFB) reactors. CFB reactors are used in a variety of industrial applications related to combustion, incineration and catalytic cracking. In this work a two-dimensional fluid dynamic model for gas-particle flow has been used to compute the porosity, pressure and velocity fields of both phases in 2-D axisymmetrical cylindrical co-ordinates. The fluid dynamic model is based on the two-fluid model approach, in which both phases are considered to be continuous and fully interpenetrating. CFB processes are essentially turbulent. The effective stress on each phase is modelled as that of a Newtonian fluid, where the effective gas viscosity is calculated from the standard k-epsilon turbulence model and the transport coefficients of the particulate phase are calculated from the kinetic theory of granular flow (KTGF). This work shows that the turbulence transfer between the phases is very important for a better representation of the fluid dynamics of CFB reactors, especially for systems with internal recirculation and high gradients of particle concentration. Two systems with different characteristics were analyzed, and the results were compared with experimental data available in the literature. The results were obtained using a computer code developed by the authors. The finite volume method with a collocated grid, the hybrid interpolation scheme, the false time step strategy and the SIMPLEC (Semi-Implicit Method for Pressure Linked Equations - Consistent) algorithm were used to obtain the numerical solution.
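The one closure named in the abstract that is easy to state explicitly is the standard k-epsilon eddy viscosity, mu_t = rho·C_mu·k²/epsilon with C_mu = 0.09, which is added to the molecular viscosity to give the effective gas viscosity used in the phase stress model. A minimal sketch with illustrative inputs only (this is just the closure, not the full two-fluid model):

def effective_gas_viscosity(rho_g, k, eps, mu_molecular, c_mu=0.09):
    # Eddy viscosity from the standard k-epsilon closure, mu_t = rho * C_mu * k**2 / eps,
    # added to the molecular viscosity to give the effective viscosity of the gas phase (SI units).
    mu_t = rho_g * c_mu * k**2 / eps
    return mu_molecular + mu_t

# Example: air at ambient conditions with assumed turbulence levels (illustrative numbers).
mu_eff = effective_gas_viscosity(rho_g=1.2, k=0.5, eps=10.0, mu_molecular=1.8e-5)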
Abstract:
This thesis presents a one-dimensional, semi-empirical dynamic model for the simulation and analysis of a calcium looping process for post-combustion CO2 capture. Reducing greenhouse gas emissions from fossil fuel power production requires rapid action, including the development of efficient carbon capture and sequestration technologies. The development of new carbon capture technologies can be expedited by using modelling tools: techno-economic evaluation of new capture processes can be done quickly and cost-effectively with computational models before building expensive pilot plants. Post-combustion calcium looping is a developing carbon capture process which uses fluidized bed technology with lime as a sorbent. The main objective of this work was to analyse the technological feasibility of the calcium looping process at different scales with a computational model. A one-dimensional dynamic model was applied to the calcium looping process, simulating the behaviour of the interconnected circulating fluidized bed reactors. The model couples fundamental mass and energy balance solvers with semi-empirical models describing solid behaviour in a circulating fluidized bed and the chemical reactions occurring in the calcium loop. In addition, fluidized bed combustion, heat transfer and core-wall layer effects were modelled. The calcium looping model framework was successfully applied to a 30 kWth laboratory scale unit and a 1.7 MWth pilot scale unit, and was used to design a conceptual 250 MWth industrial scale unit. Valuable information was gathered on the behaviour of the small scale laboratory device. In addition, the interconnected behaviour of the pilot plant reactors and the effect of solid fluidization on the thermal and carbon dioxide balances of the system were analysed. The scale-up study provided practical information on the thermal design of an industrial sized unit, the selection of particle size and operability in different load scenarios.
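At the heart of any calcium looping model is the carbonation reaction CaO + CO2 -> CaCO3 in the carbonator. As a far simpler, lumped (zero-dimensional) counterpart to the 1-D semi-empirical model described above, the sketch below integrates a commonly used first-order-in-driving-force rate law for sorbent conversion; the rate constant, maximum conversion and CO2 concentrations are illustrative placeholders, not values from the thesis.

import numpy as np

def carbonator_conversion(t_end=600.0, dt=0.5, k_s=0.05, x_max=0.3, c_co2=2.0, c_eq=0.5):
    # Lumped carbonator sketch: dX/dt = k_s * (x_max - X) * (C_CO2 - C_eq),
    # i.e. carbonation rate proportional to remaining sorbent capacity and the CO2 driving force.
    # Rate constant and concentrations are illustrative placeholders; returns time and conversion arrays.
    n = int(t_end / dt)
    t = np.linspace(0.0, t_end, n)
    X = np.zeros(n)
    for i in range(1, n):
        rate = k_s * (x_max - X[i - 1]) * max(c_co2 - c_eq, 0.0)
        X[i] = X[i - 1] + dt * rate
    return t, X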
Abstract:
Many emerging Internet applications, such as Internet TV, Internet radio and multi-point video streaming, have resource requirements in terms of consumed bandwidth, end-to-end delay, packet loss rate, etc. It is therefore necessary to formulate a proposal that specifies and provisions the resources these applications need to work well. In this thesis we propose a multi-objective traffic engineering scheme that uses different distribution trees for many multicast flows. We use a multi-path approach for each egress node and, from it, a multi-tree approach to build different multicast trees. Our proposal also determines the fraction of the traffic to split across the multiple trees. The proposal can be applied in MPLS networks by establishing explicit routes for multicast events. The first objective is to combine the following weighted objectives into a single aggregated metric: maximum link utilization, hop count, total consumed bandwidth and total end-to-end delay. We formulate this multi-objective function (the MHDB-S model), and the results obtained show that the weighted objectives are reduced and the maximum link utilization is minimized. The problem is NP-hard, so an algorithm is proposed to optimize the different objectives; its behaviour is similar to that obtained with the model. During a multicast transmission, egress nodes may join or leave the tree, so this thesis also proposes a multi-objective traffic engineering scheme that uses different trees for dynamic multicast groups (in which the egress nodes can change during the lifetime of the connection). If a multicast tree is recomputed from scratch, considerable CPU time may be consumed and all communications using that tree are temporarily interrupted. To alleviate these drawbacks, we propose an optimization model (the dynamic MHDB-D model) that reuses the multicast trees previously computed with the static MHDB-S model and adds the new egress nodes. Solving the analytical model with the weighted-sum method is not necessarily adequate, because the solution space may be non-convex and some solutions may therefore not be found. In addition, other types of objectives appear in related work. For these reasons a new model, called GMM, is proposed, and to solve it a new algorithm based on multi-objective evolutionary algorithms is proposed, inspired by the Strength Pareto Evolutionary Algorithm (SPEA). To address the dynamic case with this generalized model, we propose a new dynamic model and a computational solution based on probabilistic Breadth First Search (BFS). Finally, to evaluate the proposed optimization scheme, we ran various tests and simulations.
The main contributions of this thesis are the taxonomy; the multi-objective optimization models for the static and dynamic multicast cases (MHDB-S and MHDB-D) and the algorithms that provide computational solutions to them; and the generalized models for the static and dynamic cases (GMM and dynamic GMM) together with the computational proposals that solve them using MOEA and probabilistic BFS.
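The aggregated metric of the static MHDB-S model combines maximum link utilization, hop count, total consumed bandwidth and total end-to-end delay as weighted terms. The sketch below evaluates such a weighted-sum objective for a candidate set of multicast trees; the tree representation, normalisation and weights are illustrative assumptions, not the thesis's exact formulation.

def aggregated_cost(trees, capacities, demand, weights=(0.4, 0.2, 0.2, 0.2)):
    # Weighted-sum objective over candidate multicast trees.
    # trees: list of dicts with 'links' (list of link ids), 'hops' (int), 'delay' (seconds).
    # capacities: dict link_id -> capacity; demand: bandwidth routed on every tree.
    load = {l: 0.0 for l in capacities}
    for tree in trees:
        for link in tree['links']:
            load[link] += demand
    max_util = max(load[l] / capacities[l] for l in capacities)   # maximum link utilization
    total_hops = sum(t['hops'] for t in trees)                    # hop count
    total_bw = demand * sum(len(t['links']) for t in trees)       # total consumed bandwidth
    total_delay = sum(t['delay'] for t in trees)                  # total end-to-end delay
    w1, w2, w3, w4 = weights
    return w1 * max_util + w2 * total_hops + w3 * total_bw + w4 * total_delay

# Example with two candidate trees over links 'a'..'d' (illustrative numbers).
capacities = {'a': 10.0, 'b': 10.0, 'c': 5.0, 'd': 5.0}
trees = [{'links': ['a', 'c'], 'hops': 2, 'delay': 0.020},
         {'links': ['b', 'd'], 'hops': 2, 'delay': 0.030}]
cost = aggregated_cost(trees, capacities, demand=2.0)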
Abstract:
A new dynamic model of water quality, Q2, has recently been developed, capable of simulating large branched river systems. This paper describes the application of a generalized sensitivity analysis (GSA) to Q2 for single reaches of the River Thames in southern England. Focusing on the simulation of dissolved oxygen (DO), since this may be regarded as a proxy for the overall health of a river, the GSA is used to identify the key parameters controlling model behavior and to provide a probabilistic procedure for model calibration. It is shown that, in the River Thames at least, it is more important to obtain high quality forcing functions than to obtain improved parameter estimates once approximate values have been estimated. Furthermore, there is a need to ensure reasonable simulation of a range of water quality determinands, since a focus only on DO increases predictive uncertainty in the DO simulations. The Q2 model has been applied here to the River Thames, but it has broad utility for evaluating other systems in Europe and around the world.
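A generalized (regional) sensitivity analysis of this kind typically samples the parameter space by Monte Carlo, classifies each run as behavioural or non-behavioural against the observations, and compares the two parameter distributions, for example with a Kolmogorov-Smirnov statistic. The sketch below shows that bookkeeping for a generic model function; the RMSE score, threshold and uniform priors are illustrative choices, not the settings of the Q2 study.

import numpy as np
from scipy.stats import ks_2samp

def regional_sensitivity(model, bounds, observed, n_samples=5000, threshold=1.0, seed=0):
    # model: callable mapping a parameter vector to a simulated series (same shape as observed).
    # bounds: (n_params, 2) uniform sampling ranges; threshold: RMSE below which a run is 'behavioural'.
    # Returns the per-parameter KS statistic between behavioural and non-behavioural samples;
    # larger values flag more influential parameters.
    rng = np.random.default_rng(seed)
    bounds = np.asarray(bounds, dtype=float)
    n_params = bounds.shape[0]
    samples = rng.uniform(bounds[:, 0], bounds[:, 1], size=(n_samples, n_params))
    rmse = np.array([np.sqrt(np.mean((model(p) - observed) ** 2)) for p in samples])
    behavioural = rmse < threshold
    ks = np.zeros(n_params)
    for j in range(n_params):
        if behavioural.any() and (~behavioural).any():
            ks[j] = ks_2samp(samples[behavioural, j], samples[~behavioural, j]).statistic
    return ks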
Abstract:
The objective of this work was to construct a dynamic model of hepatic amino acid metabolism in the lactating dairy cow that could be parameterized using net flow data from in vivo experiments. The model considers 22 amino acids, ammonia, urea, and 13 energetic metabolites, and was parameterized using a steady-state balance model and two in vivo net flow experiments conducted with mid-lactation dairy cows. Extracellular flows were derived directly from the observed data, and an optimization routine was used to derive nine intracellular flows. The resulting dynamic model was found to be stable across a range of inputs, suggesting that it can be perturbed and applied to other physiological states. Although nitrogen was generally in balance, leucine was in slight deficit compared with the predicted needs for export protein synthesis, suggesting that an alternative source of leucine (e.g. peptides) was utilized. Simulations of varying glucagon concentrations indicated that an additional 5 mol/d of glucose could be synthesized at the reference substrate concentrations and blood flows. The increased glucose production was supported by increased removal from blood of lactate, glutamate, aspartate, alanine, asparagine, and glutamine. As glucose output increased, ketone body and acetate release increased while CO2 release declined. The pattern of amino acids appearing in hepatic vein blood was affected by changes in amino acid concentrations in portal vein blood, portal blood flow rate and glucagon concentration, with methionine and phenylalanine being the most affected of the essential amino acids. Experimental evidence is insufficient to determine whether essential amino acids are affected by varying gluconeogenic demands.
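One generic way to pose the derivation of unknown intracellular flows from measured extracellular net flows is as a constrained least-squares problem S·v ≈ b with v ≥ 0, where S maps intracellular flows onto the measured balances. The toy sketch below illustrates that formulation; the matrix, measurements and non-negativity constraint are assumptions for illustration, not the paper's hepatic network or its optimization routine.

import numpy as np
from scipy.optimize import nnls

# Toy flow-balance problem: estimate unknown intracellular flows v >= 0 from measured
# extracellular net flows b, with S mapping intracellular flows to those balances.
S = np.array([[ 1.0, -1.0,  0.0],
              [ 0.0,  1.0, -1.0],
              [ 1.0,  0.0,  1.0]])
b = np.array([2.0, 0.5, 4.0])

v, residual_norm = nnls(S, b)  # non-negative least-squares fit of the flows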
Abstract:
In the past decade, a number of mechanistic, dynamic simulation models of several components of the dairy production system have become available. However, their use has been limited due to the detailed technical knowledge and special software required to run them, and the lack of compatibility between models in predicting various metabolic processes in the animal. The first objective of the current study was to integrate the dynamic models of [Brit. J. Nutr. 72 (1994) 679] on rumen function, [J. Anim. Sci. 79 (2001) 1584] on methane production, [J. Anim. Sci. 80 (2002) 2481] on N partition, and a new model of P partition. The second objective was to construct a decision support system to analyse nutrient partition between animal and environment. The integrated model combines key environmental pollutants such as N, P and methane within a nutrient-based feed evaluation system. The model was run under different scenarios and the sensitivity of various parameters analysed. A comparison of predictions from the integrated model with the original simulation models showed an improvement in N excretion, since the integrated model uses the dynamic model of [Brit. J. Nutr. 72 (1994) 679] to predict microbial N, which was not represented in detail in the original model. The integrated model can be used to investigate the degree to which production and environmental objectives are antagonistic, and it may help to explain and understand the complex mechanisms involved at the ruminal and metabolic levels. Among the outputs of the integrated model are the forms of N and P in excreta and methane, which can be used as indices of environmental pollution.
Abstract:
Steady state and dynamic models have been developed and applied to the River Kennet system. Annual nitrogen exports from the land surface to the river have been estimated based on land use from the 1930s and the 1990s. Long term modelled trends indicate that there has been a large increase in nitrogen transport into the river system, driven by increased fertiliser application associated with increased cereal production, increased population and increased livestock levels. The dynamic model INCA (Integrated Nitrogen in Catchments) has been applied to simulate the day-to-day transport of N from the terrestrial ecosystem to the riverine environment. This process-based model generates spatial and temporal data and reproduces the observed instream concentrations. Applying the model to current land use and 1930s land use indicates that there has been a major shift in the short term dynamics since the 1930s, with increased river and groundwater concentrations caused by both non-point source pollution from agriculture and point source discharges.