Abstract:
After earthquakes, licensed inspectors use established codes to assess the damage to structural elements, a process that typically takes days to weeks. Emergency responders (e.g. firefighters), however, must enter damaged structures within hours of a disaster to save lives and therefore cannot wait for an official assessment to be completed; this is a risk firefighters have to take. Although search and rescue organizations offer training seminars to familiarize firefighters with structural damage assessment, the training's effectiveness is hard to guarantee when firefighters carry out rescue and damage assessment operations at the same time, and it is not available to every firefighter. The authors therefore propose a novel framework that gives firefighters a quick but crude assessment of damaged buildings by evaluating the visible damage on critical structural elements (concrete columns, in this study). This paper presents the first step of the framework: automating the detection of concrete columns from visual data. The typical shape of a column (a pair of long vertical lines) is recognized using edge detection and the Hough transform, and a bounding rectangle is formed for each pair of long vertical lines. When the rectangle resembles a column and the material between the two lines is recognized as concrete, the region is marked as a concrete column surface. The method is tested on real video and image data. Preliminary results indicate that concrete columns can be detected when they are not too distant and have at least one surface visible.
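As a rough illustration of the line-detection step, the sketch below finds long, near-vertical edge segments with a Canny detector and the probabilistic Hough transform, assuming OpenCV and NumPy are available; the function name and thresholds are illustrative choices, not the authors' implementation.

```python
# Minimal sketch: detect long, near-vertical line segments that could
# bound a column surface. Assumes OpenCV (cv2) and numpy.
import cv2
import numpy as np

def find_vertical_lines(image_bgr, min_length_frac=0.4, max_tilt_deg=10):
    """Return long near-vertical line segments detected in the image."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    h = image_bgr.shape[0]
    # Probabilistic Hough transform: keep only long segments.
    segments = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                               threshold=80,
                               minLineLength=int(min_length_frac * h),
                               maxLineGap=10)
    vertical = []
    if segments is not None:
        for x1, y1, x2, y2 in segments[:, 0]:
            # Tilt away from the vertical axis, in degrees.
            tilt = abs(np.degrees(np.arctan2(x2 - x1, y2 - y1)))
            tilt = min(tilt, 180 - tilt)
            if tilt <= max_tilt_deg:
                vertical.append((x1, y1, x2, y2))
    return vertical
```

Pairs of such segments would then be grouped into candidate bounding rectangles and checked against a concrete-material classifier, as the abstract describes.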
Abstract:
This paper describes an algorithm for scheduling packets in real-time multimedia data streams. Common to these classes of data streams are service constraints in terms of bandwidth and delay. However, real-time multimedia streams typically tolerate bounded delay variations and, in some cases, finite packet losses. We have therefore developed a scheduling algorithm that assumes streams have window-constraints on groups of consecutive packet deadlines. A window-constraint defines the number of packet deadlines that can be missed in a window of deadlines for consecutive packets in a stream. Our algorithm, called Dynamic Window-Constrained Scheduling (DWCS), attempts to guarantee that no more than x out of a window of y deadlines are missed for consecutive packets in real-time and multimedia streams. Using DWCS, the delay of service to real-time streams is bounded even when the scheduler is overloaded. Moreover, DWCS can ensure independent delay bounds on streams while guaranteeing minimum bandwidth utilizations over tunable, finite windows of time. We show the conditions under which the total demand for link bandwidth by a set of real-time (i.e., window-constrained) streams can exceed 100% while all window-constraints are still met. In fact, we show how it is possible to guarantee worst-case per-stream bandwidth and delay constraints while utilizing all available link capacity. Finally, we show how best-effort packets can be serviced with fast response time in the presence of window-constrained traffic.
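As a minimal sketch of the window-constraint idea (not the full DWCS precedence rules), the snippet below tracks, under a sliding-window reading of the constraint, whether a stream has missed more than x of its last y consecutive deadlines; the class name and test values are illustrative.

```python
# Minimal sketch of window-constraint bookkeeping: each stream may miss
# at most x of every y consecutive packet deadlines.
from collections import deque

class WindowConstraint:
    def __init__(self, x, y):
        self.x, self.y = x, y            # tolerate x misses per window of y
        self.history = deque(maxlen=y)   # True = deadline missed

    def record(self, missed):
        self.history.append(missed)

    def violated(self):
        # Broken if more than x of the last y deadlines were missed.
        return (len(self.history) == self.history.maxlen
                and sum(self.history) > self.x)

wc = WindowConstraint(x=1, y=4)
for missed in [False, True, False, False, True, True]:
    wc.record(missed)
    print(missed, wc.violated())
```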
Abstract:
In this paper, we extend the heterogeneous panel data stationarity test of Hadri [Econometrics Journal, Vol. 3 (2000), pp. 148–161] to cases where breaks are taken into account. Four models with different patterns of breaks under the null hypothesis are specified. Two of the models have already been proposed by Carrion-i-Silvestre et al. [Econometrics Journal, Vol. 8 (2005), pp. 159–175]. The moments of the statistics corresponding to the four models are derived in closed form via characteristic functions. We also provide the exact moments of a modified statistic that does not asymptotically depend on the location of the break point under the null hypothesis. The cases where the break point is unknown are also considered. For the model with breaks in the level and no time trend, and for the model with breaks in both the level and the time trend, Carrion-i-Silvestre et al. [Econometrics Journal, Vol. 8 (2005), pp. 159–175] showed that the number of breaks and their positions may be allowed to differ across individuals for cases with both known and unknown breaks. Their results can easily be extended to the proposed modified statistic. The asymptotic distributions of all the proposed statistics are derived under the null hypothesis and are shown to be normal. Simulations show that our suggested tests in general perform well in finite samples, except for the modified test. In an empirical application to the consumer prices of 22 OECD countries over the period 1953 to 2003, we find evidence of stationarity once a structural break and cross-sectional dependence are accommodated.
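For intuition, here is a minimal sketch of a Hadri-type panel stationarity statistic in the level case without breaks (not the break-adjusted statistics of this paper): average the individual KPSS-type LM statistics and standardize with the Brownian-bridge moments, mean 1/6 and variance 1/45. The long-run variance estimate is the naive i.i.d. one, a simplification.

```python
# Hedged sketch of a Hadri-type panel statistic (level case, no breaks).
import numpy as np

def hadri_z(panel):
    """panel: (N, T) array, one row per individual series."""
    N, T = panel.shape
    lm = np.empty(N)
    for i, y in enumerate(panel):
        e = y - y.mean()          # residuals from a regression on a constant
        S = np.cumsum(e)          # partial sums of residuals
        sigma2 = (e ** 2).mean()  # naive long-run variance estimate
        lm[i] = (S ** 2).sum() / (T ** 2 * sigma2)
    # Standardize with the Brownian-bridge moments: mean 1/6, variance 1/45.
    return np.sqrt(N) * (lm.mean() - 1 / 6) / np.sqrt(1 / 45)

rng = np.random.default_rng(0)
print(hadri_z(rng.standard_normal((22, 51))))  # stationary data: Z near N(0,1)
```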
Abstract:
Wavelet transforms provide basis functions for time-frequency analysis and have properties that are particularly useful for compressing analogue point-on-wave transient and disturbance signals in power systems. This paper evaluates the data-reduction properties of the wavelet transform using real power system data and discusses the application of the reduction method to information transfer in network communications.
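A minimal sketch of the compression idea, assuming the PyWavelets package: decompose a signal containing a transient, hard-threshold the small coefficients, and count how many survive. The signal, wavelet and threshold choices here are illustrative, not those of the paper.

```python
# Wavelet-based data reduction: decompose, discard small coefficients.
import numpy as np
import pywt

t = np.linspace(0, 0.2, 2048)
signal = np.sin(2 * np.pi * 50 * t)               # 50 Hz fundamental
signal[1000:1050] += np.exp(-np.arange(50) / 10)  # injected transient

coeffs = pywt.wavedec(signal, 'db4', level=5)
thresh = 0.1 * max(np.abs(c).max() for c in coeffs)
kept = [pywt.threshold(c, thresh, mode='hard') for c in coeffs]

reconstructed = pywt.waverec(kept, 'db4')
nonzero = sum(int(np.count_nonzero(c)) for c in kept)
total = sum(c.size for c in coeffs)
print(f"kept {nonzero}/{total} coefficients")
```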
Abstract:
In this paper, we re-examine two important aspects of the dynamics of relative primary commodity prices, namely the secular trend and the short-run volatility. To do so, we employ 25 series, some of them starting as far back as 1650, and powerful panel data stationarity tests that allow for endogenous multiple structural breaks. Results show that all the series are stationary once endogenous multiple breaks are allowed for. Test results on the Prebisch–Singer hypothesis, which states that relative commodity prices follow a downward secular trend, are mixed, but a majority of the series show negative trends. We also make a first attempt at identifying the potential drivers of the structural breaks. We end by investigating the dynamics of the volatility of the 25 relative primary commodity prices, also allowing for endogenous multiple breaks. We describe the often time-varying volatility in commodity prices and show that it has increased in recent years.
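As a toy illustration (not the paper's break-adjusted tests), the sketch below fits a secular log-linear trend to a simulated relative price series and inspects the rolling volatility of the detrended residuals; all data and parameters are invented.

```python
# Illustrative only: secular trend plus rolling volatility of residuals.
import numpy as np

def trend_and_volatility(log_price, window=20):
    t = np.arange(len(log_price))
    slope, intercept = np.polyfit(t, log_price, 1)   # secular trend
    resid = log_price - (intercept + slope * t)
    vol = np.array([resid[i:i + window].std()
                    for i in range(len(resid) - window + 1)])
    return slope, vol

rng = np.random.default_rng(1)
series = -0.005 * np.arange(200) + rng.standard_normal(200) * 0.05
slope, vol = trend_and_volatility(series)
print(f"trend per period: {slope:.4f}")
```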
Abstract:
Background: Real-time quantitative PCR (qPCR) is a highly sensitive and specific method used extensively for determining gene expression profiles in a variety of cell and tissue types. To obtain accurate and reliable gene expression quantification, qPCR data are generally normalised against so-called reference or housekeeping genes. Ideally, reference genes should have abundant and stable transcript levels under the experimental conditions employed. However, reference genes are often selected rather arbitrarily, and some have indeed been shown to vary in expression under a variety of in vitro experimental conditions.
Objective: The objective of the current study was to investigate reference gene expression in human periodontal ligament (PDL) cells in response to treatment with lipopolysaccharide (LPS).
Method: Primary human PDL cells were grown in Dulbecco's Modified Eagle Medium with L-glutamine supplemented with 10% fetal bovine serum, 100 IU/ml penicillin and 100 µg/ml streptomycin. RNA was isolated using the RNeasy Mini Kit (Qiagen) and reverse transcribed using the QuantiTect Reverse Transcription Kit (Qiagen). The expression of a total of 19 reference genes was studied in the presence and absence of LPS treatment using the Roche Reference Gene Panel. Data were analysed using the NormFinder and BestKeeper validation programs.
Results: Treatment of human PDL cells with LPS resulted in changes in the expression of several commonly used reference genes, including GAPDH. On the other hand, the reference genes β-actin, G6PDH and 18S were identified as stable following LPS treatment.
Conclusion: Many of the reference genes studied were robust to LPS treatment (up to 100 ng/ml). However, several commonly employed reference genes, including GAPDH, varied with LPS treatment, suggesting they would not be ideal candidates for normalisation in qPCR gene expression studies.
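By way of illustration (this is neither NormFinder nor BestKeeper), a crude stability screen can rank candidate reference genes by the spread of their Cq values across control and treated samples; the gene set and values below are invented.

```python
# Crude stability screen: smaller Cq spread = more stable reference gene.
import numpy as np

cq = {                       # hypothetical Cq values: 3 control, 3 LPS
    "GAPDH":  [18.1, 18.3, 18.2, 19.4, 19.6, 19.5],
    "ACTB":   [17.0, 17.1, 16.9, 17.1, 17.0, 17.2],
    "G6PDH":  [22.4, 22.5, 22.3, 22.5, 22.4, 22.6],
    "RNA18S": [9.1, 9.2, 9.0, 9.1, 9.2, 9.1],
}
for gene, values in sorted(cq.items(), key=lambda kv: np.std(kv[1])):
    print(f"{gene:7s} SD = {np.std(values):.2f}")
```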
Abstract:
The current crisis, which began in 2008 as a financial crisis but has had, and continues to have, major repercussions on the real economy worldwide, has brought to the fore the debate on the possible relationship between the financial system and economic growth. The affected countries, in particular those of the European Union, have joined efforts to reformulate economic and financial policies so as to recover their economies and prevent future crises. The costs and effects of the crisis have given rise to several studies questioning the correlation between financial development and real economic growth, hitherto assumed to be always positive. The main objective of this dissertation is to analyse how the rapid growth and oversizing of the financial sector, driven essentially by financial deregulation and by the financial boom recorded from the 1990s onwards, can influence the growth of the real economy, giving rise to the thesis that there is a threshold beyond which the effect of finance on economic growth may become perverse. The literature on the relationship between the financial system and economic growth is not unanimous as to the sign of that relationship; a recent strand argues that an excessively large financial sector, by diverting too many resources from other sectors of the economy, may hamper the sustained growth of the real economy. To confirm or refute the existence of such a threshold for real-economy growth arising from the financial side (transmitted mainly through credit to the private sector and financial-sector employment), an econometric panel data analysis was carried out for the European Union countries over the period 1990 to 2010. The results support the threshold hypothesis, and in particular a "parabolic" relationship between finance and growth. Additionally, the study reveals a negative influence of public spending on growth.
Abstract:
This paper examines the empirical relationship between financial intermediation and economic growth using cross-country and panel data regressions for 69 developing countries over the 1960-1990 period. The main results are: (i) financial development is a significant determinant of economic growth, as has been shown in cross-sectional regressions; (ii) financial markets cease to exert any effect on real activity when the temporal dimension is introduced into the regressions. The paradox may be explained, in the case of developing countries, by the lack of an entrepreneurial private sector capable of transforming the available funds into profitable projects; (iii) the effect of financial development on economic growth is channeled mainly through an increase in investment efficiency.
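Result (ii) can be illustrated with a simulated panel in which finance levels differ across countries in a way correlated with growth, while within-country changes in finance have no effect: pooled OLS then finds a positive coefficient that the within (fixed-effects) estimator does not. The simulation below is a hedged sketch, not the paper's data.

```python
# Pooled OLS vs. within (fixed-effects) slope on a simulated panel.
import numpy as np

rng = np.random.default_rng(3)
N, T = 69, 30
alpha = rng.standard_normal(N)                       # country effects
fin = alpha[:, None] + rng.standard_normal((N, T)) * 0.5
growth = alpha[:, None] + rng.standard_normal((N, T)) * 0.5  # no causal link

def slope(x, y):
    x, y = x - x.mean(), y - y.mean()
    return (x * y).sum() / (x ** 2).sum()

print("pooled OLS:", round(slope(fin.ravel(), growth.ravel()), 3))
within_f = fin - fin.mean(axis=1, keepdims=True)     # demean by country
within_g = growth - growth.mean(axis=1, keepdims=True)
print("fixed effects:", round(slope(within_f.ravel(), within_g.ravel()), 3))
```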
Abstract:
Recent work shows that a low correlation between the instruments and the included variables leads to serious inference problems. We extend the local-to-zero analysis of models with weak instruments to models with estimated instruments and regressors and with higher-order dependence between instruments and disturbances. This makes the framework applicable to linear models with expectation variables that are estimated non-parametrically. Two examples of such models are the risk-return trade-off in finance and the impact of inflation uncertainty on real economic activity. Results show that inference based on Lagrange Multiplier (LM) tests is more robust to weak instruments than Wald-based inference. Using LM confidence intervals leads us to conclude that no statistically significant risk premium is present in returns on the S&P 500 index, in excess holding yields between 6-month and 3-month Treasury bills, or in yen-dollar spot returns.
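A hedged simulation of the underlying problem (just-identified IV with a local-to-zero first stage, not the paper's estimated-instrument setting): with a weak instrument the 2SLS estimator becomes erratic, which is why Wald-type inference is unreliable.

```python
# Weak-instrument simulation: first-stage coefficient shrinks with sqrt(n).
import numpy as np

rng = np.random.default_rng(4)
n, reps, beta = 200, 2000, 1.0
pi = 1.0 / np.sqrt(n)                     # local-to-zero first stage
est = np.empty(reps)
for r in range(reps):
    z = rng.standard_normal(n)
    u = rng.standard_normal(n)
    x = pi * z + 0.8 * u + 0.6 * rng.standard_normal(n)  # endogenous regressor
    y = beta * x + u
    est[r] = (z @ y) / (z @ x)            # just-identified 2SLS = IV ratio
print("median bias:", round(np.median(est) - beta, 2))
print("interquartile range:",
      round(np.subtract(*np.percentile(est, [75, 25])), 2))
```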
Abstract:
Most panel unit root tests are designed to test the joint null hypothesis of a unit root for each individual series in a panel. After a rejection, it will often be of interest to identify which series can be deemed stationary and which nonstationary. Researchers sometimes carry out this classification on the basis of n individual (univariate) unit root tests at some ad hoc significance level. In this paper, we demonstrate how to use the false discovery rate (FDR) in evaluating I(1)/I(0) classifications based on individual unit root tests when the cross-section (n) and time series (T) dimensions are large. We report results from a simulation experiment and illustrate the methods on two data sets.
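As a minimal sketch of the classification step, one standard FDR procedure (Benjamini-Hochberg, used here as a stand-in; the paper's exact method may differ) can be applied to n individual unit-root p-values, labelling the rejected series I(0). The p-values below are invented.

```python
# Benjamini-Hochberg FDR classification of unit-root p-values.
import numpy as np

def bh_reject(pvals, q=0.05):
    """Return a boolean mask of rejected hypotheses at FDR level q."""
    p = np.asarray(pvals)
    order = np.argsort(p)
    n = len(p)
    thresh = q * np.arange(1, n + 1) / n
    below = p[order] <= thresh
    k = below.nonzero()[0].max() + 1 if below.any() else 0
    reject = np.zeros(n, dtype=bool)
    reject[order[:k]] = True
    return reject

pvals = [0.001, 0.008, 0.039, 0.041, 0.22, 0.49, 0.74]
print(bh_reject(pvals))  # True = unit root rejected, series deemed I(0)
```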
Abstract:
This paper examines the fiscal sustainability hypothesis for eight Latin American countries. Using a panel data model, it determines whether government revenues and primary spending between 1960 and 2009 are cointegrated, that is, whether they are sustainable in the long run. To this end, second-generation panel unit root and cointegration tests for macroeconomic panel data were used, which allow for cross-sectional dependence between countries as well as possible structural breaks in the relationship, determined endogenously; in particular, the Hadri and Rao (2008) stationarity test and the Westerlund (2006) cointegration test. The analysis finds empirical evidence that, over the period under study, the primary deficit in the eight Latin American countries is sustainable, but only in a weak sense.
Abstract:
This research project starts from the dynamics of the outsourced distribution model of a mass-consumption company in Colombia, specialized in dairy products, referred to in this study as "Lactosa". Using panel data within a case study, two demand models are built, by product category and by distributor, and stochastic simulation is used to identify the relevant variables affecting their cost structures. The problem is modelled from the income statement of each of the four distributors analysed in the country's central region. The cost structure and sales behaviour are analysed given a logistics distribution margin (%), as a function of the relevant independent variables relating to the business, the market and the macroeconomic environment described in the object of study. Among other findings, notable gaps stand out in distribution costs and sales-force costs, despite the homogeneity of segments. The study identifies value drivers and the costs with the greatest individual dispersion, and suggests strategic alliances among some groups of distributors. The panel data modelling identifies the relevant management variables that affect sales volume by category and distributor, focusing management's efforts. It is recommended to narrow the gaps and to promote, from the producer's side, strategies focused on standardizing the distributors' internal processes, and to promote and replicate the analytical models without attempting to replace expert knowledge. Scenario building jointly and reliably strengthens the competitive position of the company and its distributors.
Abstract:
Terrorism is nowadays considered one of the most controversial concepts in the social, academic and political fields. The term came into use after the French Revolution, but more recently, following the attacks of 11 September 2001, it has gained enormous relevance and has motivated numerous studies attempting to understand what terrorism is. Although several systematic reviews already exist, the purpose of this work is to review, group and synthesize the different theories and concepts formulated by authors who have worked on the concept of "terrorism", in order to understand the implications of its use in discourse and how this affects the internal dynamics of societies in relation to violence, beliefs and stereotypes, among other elements. To this end, 56 articles published between 1985 and 2013 were reviewed, together with 10 secondary sources (news items and newspaper articles from 1995 to 2013) and 10 statistical studies whose results contribute to the understanding of the topic. The search was limited to the historical development of terrorism, its different dimensions and the social construction of the reality of terrorism. The findings show that the word "terrorism" is a concept that acts as a linguistic vehicle which can be used for strategic ends, mobilizing the public through discourse and political interests, and they highlight the need to study the psychological and social implications of its use.
Abstract:
This article studies the possibility of introducing unemployment insurance in Colombia. The first part offers a literature review of unemployment insurance, setting out the advantages of coverage against this risk as well as its drawbacks. The second part studies several scenarios for introducing unemployment insurance in Colombia. After presenting the context of the labour market and the rules that govern it, several designs are proposed addressing the management and administration of unemployment risk in Colombia. Some theoretical considerations for valuing the cost of the insurance are also presented, incorporating the effects of moral hazard on the duration and incidence of unemployment.
Abstract:
We propose and estimate a financial distress model that explicitly accounts for interactions, or spill-over effects, between financial institutions through a spatial contiguity matrix built from financial network data on interbank transactions. This setup of the financial distress model allows for empirical validation of the importance of network externalities in determining financial distress, in addition to institution-specific and macroeconomic covariates. The relevance of this specification is that it simultaneously incorporates micro-prudential factors (Basel II) as well as macro-prudential and systemic factors (Basel III) as determinants of financial distress. Results indicate that network externalities are an important determinant of the financial health of financial institutions. The parameter measuring the effect of network externalities is both economically and statistically significant, and its inclusion as a risk factor reduces the importance of firm-specific variables such as the size or degree of leverage of the financial institution. In addition, we analyze the policy implications of the network factor model for capital requirements and deposit insurance pricing.
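A minimal sketch of the network term, assuming the contiguity matrix is a row-normalized interbank exposure matrix: the spatial lag W·d gives each institution the exposure-weighted distress of its counterparties, which can then enter the model as a covariate. The exposure and distress values below are invented.

```python
# Row-normalized interbank exposures as a spatial weights matrix.
import numpy as np

exposures = np.array([[0., 5., 0., 2.],    # invented interbank network
                      [5., 0., 3., 0.],
                      [0., 3., 0., 4.],
                      [2., 0., 4., 0.]])
W = exposures / exposures.sum(axis=1, keepdims=True)  # row-normalize

distress = np.array([0.1, 0.6, 0.2, 0.3])  # institution distress scores
spatial_lag = W @ distress                 # neighbors' weighted distress
print(spatial_lag)
```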