948 results for Random Coefficient Autoregressive Model (RCAR(1))
Abstract:
Japanese ODA, especially that undertaken by JICA, has targeted South Sulawesi Province as a core area of development in eastern Indonesia, in the hope that the economic growth of South Sulawesi will bring about spillover effects in other regions. This paper tests the validity of that strategy using a Vector Autoregressive (VAR) framework. The results show that South Sulawesi's economy Granger-causes other regions in eastern Indonesia, but not vice versa, implying that South Sulawesi drives the development of the other regions in eastern Indonesia. Further analysis shows that the development of the agricultural sector in South Sulawesi potentially has higher spillover effects than other sectors, and that the magnitude of the spillover effect from South Sulawesi on eastern Indonesia is higher than that of other economically important regions, such as Eastern Java and Kalimantan.
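The Granger-causality test underlying this kind of VAR analysis can be sketched as a bivariate F-test: does adding lags of x improve the prediction of y over y's own lags? This is a generic illustration on synthetic data, not the paper's dataset; in practice a package such as statsmodels would typically be used.

```python
import numpy as np

def granger_f(y, x, p=2):
    """F statistic for the null 'x does not Granger-cause y', comparing
    an AR(p) model of y with one augmented by p lags of x."""
    T = len(y)
    Y = y[p:]
    ylags = np.column_stack([y[p - i:T - i] for i in range(1, p + 1)])
    xlags = np.column_stack([x[p - i:T - i] for i in range(1, p + 1)])
    Xr = np.column_stack([np.ones(T - p), ylags])   # restricted model
    Xu = np.column_stack([Xr, xlags])               # unrestricted model
    rss = lambda X: np.sum((Y - X @ np.linalg.lstsq(X, Y, rcond=None)[0]) ** 2)
    rss_r, rss_u = rss(Xr), rss(Xu)
    dof = (T - p) - Xu.shape[1]
    # Standard F-test on the p exclusion restrictions
    return ((rss_r - rss_u) / p) / (rss_u / dof)
```

On data where x truly drives y, the statistic in the x-to-y direction is large while the reverse direction stays near its null distribution.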
Abstract:
Foreign firms have clustered together in the Yangtze River Delta, and their impact on domestic firms is an important policy issue. This paper studies the spatial effect of FDI agglomeration on the regional productivity of domestic firms, using Chinese firm-level data. To identify local FDI spillovers, we estimate the causal impact of foreign firms on domestic firms in the same county and similar industries. We then estimate a spatial-autoregressive model to examine spatial spillovers from FDI clusters to other domestic firms in distant counties. Our results show that FDI agglomeration generates positive spillovers for domestic firms, which are stronger in nearby areas than in distant areas.
Abstract:
The presence of a large informal sector in developing economies poses the question of whether informal activity produces agglomeration externalities. This paper uses data on all the nonfarm establishments and enterprises in Cambodia to estimate the impact of informal agglomeration on the regional economic performance of formal and informal firms. We develop a Bayesian approach for a spatial autoregressive model with an endogenous explanatory variable to address endogeneity and spatial dependence. We find a significantly positive effect of informal agglomeration, where informal firms gain more strongly than formal firms. Calculating the spatial marginal effects of increased agglomeration, we demonstrate that more accessible regions are more likely than less accessible regions to benefit strongly from informal agglomeration.
Abstract:
This study reviews new trends in seismic-resistant design, focusing on the base isolation technique, as the most effective, widespread and widely used one, and analyses the structural and economic advantages a building can gain by applying it. The most common type of reinforced-concrete building likely to be isolated is chosen, in this case a hospital, whose fixed-base model is subjected to several seismic codes, mainly comparing base shear forces and considering soil-structure interaction; to support this calculation, a program of beam elements with 6 degrees of freedom per node is developed in Matlab code. The isolated model includes the analysis of three combinations of isolator types, HDR, LPR and FPS, alternating simplified linear models of 1 and 3 degrees of freedom per floor, evaluating differences in the structural response and selecting the combination that gives the most convenient results; the explicit central difference method is used for the nonlinear modeling of each isolation system. Finally, a comparative analysis of the damage expected under the design earthquake is carried out, using the rapid method and taking the spectral displacement of the top floor as a reference; conclusions and recommendations for the use of isolation systems are then given.
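The explicit central difference method mentioned for the nonlinear isolation models can be sketched for a single-degree-of-freedom system as follows. This is the generic textbook scheme, not the thesis code, and all parameter values in the usage below are illustrative.

```python
import numpy as np

def central_difference(m, c, k, p, u0, v0, dt):
    """Explicit central-difference integration of m*u'' + c*u' + k*u = p(t).
    p is the load sampled at steps of dt; returns the displacement history."""
    n = len(p)
    u = np.zeros(n)
    u[0] = u0
    a0 = (p[0] - c * v0 - k * u0) / m
    u_prev = u0 - dt * v0 + 0.5 * dt ** 2 * a0   # fictitious value at t = -dt
    k_hat = m / dt ** 2 + c / (2 * dt)           # effective stiffness
    a_coef = k - 2 * m / dt ** 2
    b_coef = m / dt ** 2 - c / (2 * dt)
    for i in range(n - 1):
        p_hat = p[i] - a_coef * u[i] - b_coef * u_prev
        u_prev, u[i + 1] = u[i], p_hat / k_hat
    return u
```

Because the scheme is explicit, it is only conditionally stable: the time step must satisfy dt < T_n/pi for the shortest natural period T_n, which is why it suits the short steps typical of nonlinear isolator models.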
Abstract:
Soccer is a sport with a high incidence of injury. Moreover, in professional soccer, injuries entail a tough recovery process for the player, reduced athletic performance for the player and the team, and large economic costs for the club. Within this problem, the literature concludes that the preseason shows a greater incidence of overuse (i.e., non-contact) injury, a type of injury that is within our reach to prevent. We therefore consider it important to know and develop methods, tools and principles for obtaining effective preventive programs that reduce the high injury-incidence figures reported in the literature. This study observed and recorded the injuries, through the F-MARC questionnaire, of a professional soccer team of the Spanish league during the 2008 (n=24) and 2009 (n=24) preseasons. In addition, during the 2009 preseason, infrared thermography was applied to acquire information on the players' assimilation of the training load, and this information was used to improve decision making in the specific post-exercise injury-prevention protocols, which were the same as those used previously in the 2008 preseason. The study has a pre-post design without a control group. It is a longitudinal study in which, after an initial injury registry in the 2008 preseason, the subjects were exposed to the independent variable, the use of infrared thermography, in the injury-prevention protocol during the 2009 preseason. The results show a significant reduction of up to 60% in injury frequency during the 2009 preseason, and a decrease in total injury incidence from 8.3 injuries per 1000 hours of exposure in 2008 to 3.4 in 2009.
With this, the probability of injury fell considerably, from 85% of players injured in 2008 to 26% in 2009; muscle injuries decreased by 70% in 2009 with respect to 2008, and the days lost to all injuries were reduced by 91.8% in the 2009 preseason. On the other hand, the thermal profile of the players, as a function of their laterality and dominance, shows very similar mean and maximum temperatures, with the highest temperatures in the lumbar and popliteal regions and the lowest in the ankles and knees. All body regions studied, except the ankle (p<0.05), showed no significant differences between the two sides of the body, with an average side-to-side difference of 0.06 ± 0.16 °C. Given these results, we consider 0.3 °C the upper limit of normal bilateral thermal asymmetry for all lower-limb regions studied except the ankles. The environmental parameter most strongly related to the temperature recorded by the thermographic camera is room temperature, with a correlation coefficient close to r = 1.00, followed by atmospheric pressure with a correlation coefficient near r = 0.50; humidity shows no relation to the temperature recorded in each body region within the range of values considered in the study. One-way ANOVA indicates mean differences among the three room-temperature groups (1st = 18-21 °C, 2nd = 22-24 °C, 3rd = 25-31 °C), and Tukey's HSD test indicates differences between each pair of groups in all body regions studied except the ankles.
Finally, the equation TC-standard = TC-real - [0.184 * (TS - 21 °C)] is proposed to predict the influence of room temperature on the temperature recorded by the thermographic camera. In conclusion, from the results obtained we can state that applying a post-exercise injury-prevention protocol based on information acquired through infrared thermography assessments reduces injury incidence in the group of professional soccer players studied. We must be aware that this is a field study, in which many factors that are difficult to control may have influenced the results. We must therefore be cautious and conclude that the information acquired from the thermographic evaluations was one of the aspects that helped significantly reduce injury incidence in the 2009 preseason in the Spanish professional soccer team of this study, although other factors may also have contributed to this outcome.
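The proposed correction, TC-standard = TC-real - 0.184 * (TS - 21 °C), normalizes a skin-temperature reading to the 21 °C reference room temperature. It can be expressed as a small helper; the function name and the idea of making the reference and slope parameters are mine, the constants come from the abstract.

```python
def standardize_temperature(tc_real, ts_room, ref=21.0, slope=0.184):
    """Normalize a thermographic skin-temperature reading (tc_real, in °C)
    taken at room temperature ts_room to the 21 °C reference, using the
    linear correction proposed in the abstract."""
    return tc_real - slope * (ts_room - ref)
```

For example, a 30.0 °C reading taken in a 24 °C room standardizes to 30.0 - 0.184 * 3 = 29.448 °C, and a reading taken at exactly 21 °C is unchanged.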
Abstract:
Wind farms have been extensively simulated with engineering models for the estimation of wind speed and power deficits inside wind farms. These models were initially designed for a few wind turbines located in flat terrain. Other models, based on the parabolic approximation of the Navier-Stokes equations, were later developed, making the operational resolution of big wind farms in flat terrain and offshore sites more realistic and feasible. These models have proven accurate enough when solving wake effects in this type of environment. Nevertheless, few analyses exist of how complex terrain can affect the behaviour of wind-farm wake flow. Recent numerical studies have demonstrated that topographical wakes induce a significant effect on wind turbine wakes compared with flat terrain. This circumstance has motivated the development of elliptic CFD models that allow global simulation of wind turbine wakes in complex terrain. An accurate simplification for the analysis of wind turbine wakes is the actuator disk technique. Coupling this technique with CFD wind models enables the estimation of wind farm wakes while preserving the extraction of axial momentum inside wind farms. This paper describes the analysis and validation of the elliptical wake model CFDWake 1.0 against experimental data from an operating wind farm located in complex terrain. The analysis also reports whether or not the effects of terrain and wind turbine wakes can be superimposed linearly. It also represents one of the first attempts to assess the performance of engineering models in large complex-terrain wind farms.
Abstract:
This final degree project, entitled "High-level modeling with SystemC", has as its main objective the modeling of some modules of an MPEG-2 video encoder using the SystemC digital-system description language at the TLM (Transaction Level Modeling) abstraction level. SystemC is a digital-system description language based on C++. It contains routines and libraries that implement special data types, structures and processes for modeling digital systems; a full description can be found in [GLMS02]. The TLM abstraction level is characterized by separating the communication between modules from their functionality: it places greater emphasis on the functionality of the communication (where the data go from and to) than on its exact implementation. TLM and an implementation example are described in [RSPF] and [HG]. The architecture of the model is based on the MVIP-2 encoder described in [Gar04]. The implemented modules are:
· IVIDEOH: filters the input video in the horizontal dimension and stores the filtered video in memory.
· IVIDEOV: reads the video filtered by IVIDEOH, filters it in the vertical dimension and writes the filtered video to memory.
· DCT: reads the video filtered by IVIDEOV, performs the discrete cosine transform and stores the transformed video in memory.
· QUANT: reads the video transformed by DCT, quantizes it and stores the result in memory.
· IQUANT: reads the video quantized by QUANT, performs the inverse quantization and stores the result in memory.
· IDCT: reads the video processed by IQUANT, performs the inverse cosine transform and stores the result in memory.
· IMEM: interface between the above modules and the memory. It manages simultaneous memory-access requests and guarantees exclusive access to the memory at each instant.
All these modules appear in grey in the figure showing the architecture of the model: Figure 1. Architecture of the model (SEE PDF OF THE PFC). The figure also shows some modules in white; these are test modules, added in order to run simulations and verify the modules of the model:
· CAMARA: simulates a black-and-white camera; it reads the luminance from a video file and sends it to the model through a FIFO.
· FIFO: interface between the camera and the model; it stores the data sent by the camera until IVIDEOH reads them.
· CONTROL: controls the video-processing modules; they signal it when they finish processing a video frame, and it then starts whichever modules are needed to continue the encoding. This module is responsible for the correct sequencing of the video-processing modules.
· RAM: simulates a RAM memory, including a programmable access delay.
For the tests, video files with the output of each video-processing module, message files, and a trace file showing the sequencing of the processors were also generated. As a result of the work in this PFC, it can be concluded that SystemC allows digital systems to be modeled quite easily (prior knowledge of C++ and object-oriented programming is required) and enables models at a higher abstraction level than the RTL usual in Verilog and VHDL, in the case of this PFC, the TLM.
Abstract:
This work aims at implementing the ADM1 model (The IWA Anaerobic Digestion Model No. 1) in order to assess its ability to simulate the anaerobic digestion process of wastewater sludge. Matlab and its Simulink tool were chosen to implement the ADM1 because of their capacity to simulate dynamic systems. The results showed that simulation through the ADM1 implementation is able to predict most of the average values of the most common control parameters of the anaerobic digestion process carried out at the wastewater treatment plant (WWTP) south of Madrid. It is deduced from the study that the Matlab/Simulink implementation of the ADM1 is able to simulate the dynamic anaerobic digestion process of any WWTP, provided some of the model parameters are fitted for each specific case.
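The kind of dynamic simulation described (done in Matlab/Simulink in the original work) can be illustrated with a drastically simplified, single-substrate Monod uptake model integrated by explicit Euler. This is only a stand-in sketch, not the ADM1 (which couples dozens of such uptake processes), and every parameter value below is invented for illustration.

```python
def simulate_monod(S0, X0, mu_max=0.4, Ks=0.5, Y=0.1, b=0.02,
                   dt=0.01, t_end=30.0):
    """Explicit-Euler simulation of a single-substrate Monod growth model:
    dS/dt = -rho,  dX/dt = Y*rho - b*X,  rho = mu_max*S/(Ks+S)*X.
    Returns final substrate and biomass concentrations."""
    steps = int(t_end / dt)
    S, X = S0, X0
    for _ in range(steps):
        rho = mu_max * S / (Ks + S) * X   # Monod uptake rate
        S += dt * (-rho)                  # substrate consumed
        X += dt * (Y * rho - b * X)       # biomass growth minus decay
    return S, X
```

As in the full ADM1, substrate is consumed while biomass grows with yield Y and decays at rate b; stiff systems like the real model would instead use an implicit or variable-step solver, which is what Simulink provides.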
Abstract:
The capping of epitaxially grown quantum dots (QDs) is a key process in the fabrication of devices based on these nanostructures, because capping can significantly affect QD morphology [3]. We have studied QD morphology after capping in order to better understand the role of the capping process. We have grown real structures and compared the QD morphology obtained by cross-sectional Scanning Tunneling Microscopy (X-STM) with the morphology of QDs that were virtually grown in simulations based on a Kinetic Monte Carlo (KMC) model [1].
Abstract:
Red blood cells (RBCs), previously fixed with glutaraldehyde, adhere to glass slides coated with fibrinogen. The RBC deposition process on the horizontal glass surface is investigated by analyzing the relative surface covered by the RBCs, as well as the variance of this surface coverage, as a function of the concentration of particles. This study is performed by optical microscopy and image analysis. A model, derived from the classical random sequential adsorption model, has been developed to account for the experimental results. This model highlights the strong influence of the hydrodynamic interactions during the deposition process.
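The classical random sequential adsorption (RSA) model from which the authors' model derives can be sketched as follows: disks land at random positions, stick if they do not overlap an already-deposited disk, and are rejected otherwise. This is the textbook model only (hard disks on a periodic unit square, parameters illustrative); it ignores the hydrodynamic interactions the paper highlights.

```python
import numpy as np

def rsa_coverage(radius=0.03, attempts=5000, seed=1):
    """Random sequential adsorption of hard disks on the unit square with
    periodic boundaries; returns the fraction of surface covered."""
    rng = np.random.default_rng(seed)
    pts = np.empty((0, 2))
    for _ in range(attempts):
        c = rng.random(2)
        if len(pts):
            d = np.abs(pts - c)
            d = np.minimum(d, 1.0 - d)          # minimum-image convention
            if np.any((d * d).sum(axis=1) < (2 * radius) ** 2):
                continue                        # overlap: attempt rejected
        pts = np.vstack([pts, c])               # no overlap: disk deposited
    return len(pts) * np.pi * radius ** 2
```

For disks, the coverage of this process saturates near the RSA jamming limit of about 0.547, well below close packing, because rejected attempts leave gaps too small for further disks.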
Abstract:
Traditionally, the literature estimates the equity of a brand or its extension but pays little attention to collective brand equity, even though collective branding is increasingly used to differentiate the homogeneous products of different firms or organizations. We propose an approach that estimates the incremental effect of individual brands (or the contribution of individual brands) on collective brand equity through the various stages of a consumer hierarchical buying choice process in which decisions are nested: "whether to buy", "what collective brand to buy" and "what individual brand to buy". This proposal follows the approach of Random Utility Theory, and it is theoretically argued through Associative Networks Theory and the cybernetic model of decision making. The empirical analysis, carried out in the area of collective brands in Spanish tourism, finds a three-stage hierarchical sequence and estimates the contribution of individual brands to the equity of the collective brands of "Sun, Sea and Sand" and of "World Heritage Cities".
Abstract:
Paper presented at CIDUI 2010, Congreso Internacional Docencia Universitaria e Innovación (International Congress on University Teaching and Innovation), Barcelona, 30 June - 2 July 2010.
Abstract:
There is a puzzling, little-remarked contradiction in scholarly views of the European Commission. On the one hand, the Commission is seen as the maestro of European integration, gently but persistently guiding both governments and firms toward Brussels. On the other hand, the Commission is portrayed as a headless bunch of bickering fiefdoms who can hardly be bothered by anything but their own internecine turf wars. The reason these very different views of the same institution have so seldom come into conflict is quite apparent: EU studies has a set of relatively autonomous and poorly integrated subfields that work at different levels of analysis. Those scholars holding the "heroic" view of the Commission are generally focused on the contest between national and supranational levels that characterized the 1992 program and subsequent major steps toward European integration. By contrast, those scholars with the "bureaucratic politics" view are generally authors of case studies or legislative histories of individual EU directives or decisions. However, the fact that these two images of the Commission are often two ships passing in the night hardly implies that there is no dispute. Clearly both views cannot be right; but then, how can we explain the significant support each enjoys from the empirical record? The Commission, perhaps the single most important supranational body in the world, certainly deserves better than the schizophrenic interpretation the EU studies community has given it. In this paper, I aim to make a contribution toward the unraveling of this paradox. In brief, the argument I make is as follows: the European Commission can be effective in pursuit of its broad integration goals in spite of, and even because of, its internal divisions. The folk wisdom that too many chefs spoil the broth may often be true, but it need not always be so. The paper is organized as follows.
I begin with an elaboration of the theoretical position briefly outlined above. I then turn to a case study from the major Commission efforts to restructure the computer industry in the context of its 1992 program. The computer sector does not merely provide interesting, random illustrations of the hypothesis I have advanced. Rather, as Wayne Sandholtz and John Zysman have stressed, the Commission's efforts on informatics formed one of the most crucial parts of the entire 1992 program, and so the Commission's success in "Europeanizing" these issues had significant ripple effects across the entire European political economy. I conclude with some thoughts on the following question: now that the Commission has succeeded in bringing the world to its doorstep, does its bureaucratic division still serve a useful purpose?