871 results for Panel data probit model


Relevance: 100.00%

Abstract:

The goal of this study is to analyze the relationship between market analysts' forecast errors for the profitability of companies listed on BM&FBOVESPA S.A. (Bovespa) and the disclosure requirements of the International Financial Reporting Standards (IFRS). This was done by regressing analysts' forecast errors using panel data methodology for 2010, the year IFRS was adopted in Brazil, and, complementarily, for 2012 as a benchmark. On this basis, the forecast error of the companies listed on the Bovespa was computed from forecast and realized profitability figures (earnings per share), available from the I/B/E/S Earnings Consensus Information database, provided by the Thomson ONE Investment Banking platform, and from Economática Pro®, respectively. The results indicate a negative relationship between forecast error and compliance with IFRS disclosure requirements: the higher the quality of the disclosed information, the smaller the analysts' forecast error. These results therefore support the view that the degree of compliance with accounting standards is as important as, or more important than, the standards themselves. Additionally, we find that when a company listed on BM&FBOVESPA is subject to a regulatory agency, its forecast error is unchanged. Finally, these results suggest the importance of improving auditing mechanisms for firms' compliance with disclosure requirements, such as penalties for non-compliance (enforcement), corporate governance structures, and internal and external audits.
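
As a hedged illustration of the kind of panel regression this abstract describes (not the study's actual code), the sketch below regresses a forecast-error variable on an IFRS disclosure-compliance score with firm fixed effects. The input file and all column names (forecast_error, ifrs_compliance, regulated, size) are assumptions.

```python
import pandas as pd
from linearmodels.panel import PanelOLS

# Hypothetical firm-year panel; the entity/time index is required by PanelOLS.
df = pd.read_csv("forecasts.csv").set_index(["firm", "year"])

# Forecast error on disclosure compliance plus controls; firm fixed effects
# absorb time-invariant firm characteristics.
model = PanelOLS.from_formula(
    "forecast_error ~ 1 + ifrs_compliance + regulated + size + EntityEffects",
    data=df,
)
result = model.fit(cov_type="clustered", cluster_entity=True)
# A negative ifrs_compliance coefficient would match the reported finding.
print(result.summary)
```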

Relevance: 100.00%

Abstract:

Using panel data on more than 15,000 observations of Brazilian companies, both publicly traded and privately held, over the period from 2010 to 2014, this study examined the existence of credit constraints and their relationship with investment in fixed assets, as well as the role played by working capital as a managerial tool in fixed-asset investment decisions. To that end, two baseline methodologies were used: the study by Almeida and Campello (2007), which innovated by including variables that control for some of the main criticisms of the credit-constraint literature, and the study by Ding, Guariglia and Knight (2011), which tested the relationship between investment in working capital and investment in fixed assets. The results indicate that, in general, Brazilian companies face credit constraints and that companies investing more in working capital show lower sensitivity of fixed-asset investment to cash flow, yet fail to translate this into higher investment rates.
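
A minimal sketch, in the spirit of the two baseline methodologies cited above, of an investment-cash-flow-sensitivity regression with a working-capital interaction; this is not the study's specification, and the variable names (inv_fixed, cash_flow, inv_wc, tobin_q) and input file are illustrative assumptions.

```python
import pandas as pd
from linearmodels.panel import PanelOLS

df = pd.read_csv("firms_2010_2014.csv").set_index(["firm", "year"])

# Fixed-asset investment on cash flow, working-capital investment, and their
# interaction: a negative interaction term indicates that firms investing more
# in working capital show lower sensitivity of fixed investment to cash flow.
formula = ("inv_fixed ~ 1 + cash_flow + inv_wc + cash_flow:inv_wc "
           "+ tobin_q + EntityEffects + TimeEffects")
res = PanelOLS.from_formula(formula, data=df).fit(
    cov_type="clustered", cluster_entity=True
)
print(res.params)
```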

Relevance: 100.00%

Abstract:

This Master's thesis consists of one theoretical article and one empirical article in the field of microeconometrics. The first chapter (we also thank useful suggestions by Marinho Bertanha, Gabriel Cepaluni, Brigham Frandsen, Dalia Ghanem, Ricardo Masini, Marcela Mello, Áureo de Paula, Cristine Pinto, Edson Severnini and seminar participants at the São Paulo School of Economics, the California Econometrics Conference 2015 and the 37th Brazilian Meeting of Econometrics), called "Synthetic Control Estimator: A Generalized Inference Procedure and Confidence Sets", contributes to the literature on inference techniques for the Synthetic Control Method. This methodology was proposed to answer questions involving counterfactuals when only one treated unit and a few control units are observed. Although the method has been applied in many empirical works, the formal theory behind its inference procedure is still an open question. To fill this gap, we make explicit the sufficient hypotheses that guarantee the adequacy of Fisher's exact hypothesis testing procedure for panel data, allowing us to test any sharp null hypothesis and, consequently, to propose a new way to estimate confidence sets for the synthetic control estimator by inverting a test statistic, the first confidence set available when we only have access to finite-sample, aggregate-level data whose cross-sectional dimension may be larger than its time dimension. Moreover, we analyze the size and power of the proposed test in a Monte Carlo experiment and find that test statistics based on the synthetic control method outperform test statistics commonly used in the evaluation literature. We also extend our framework to the cases in which more than one outcome of interest (simultaneous hypothesis testing) or more than one treated unit (pooled intervention effect) is observed, and to the presence of heteroskedasticity. The second chapter, called "Free Economic Area of Manaus: An Impact Evaluation using the Synthetic Control Method", is an empirical article. We apply the synthetic control method to Brazilian city-level data from the 20th century to evaluate the economic impact of the Free Economic Area of Manaus (FEAM). We find that this enterprise zone had significant positive effects on real GDP per capita and total services production per capita, but also significant negative effects on total agricultural production per capita. Our results suggest that this subsidy policy achieved its goal of promoting regional economic growth, even though it may have caused misallocation of resources across economic sectors.
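
The sketch below illustrates the Fisher-style placebo (permutation) logic underlying the inference procedure described above: every unit is treated in turn as the placebo-treated unit, a synthetic control is fitted on pre-treatment outcomes, and the p-value is the rank of the treated unit's test statistic. The data are synthetic, and the nonnegative-least-squares weight fit is a simplified stand-in for the simplex-constrained synthetic control weights, not the chapter's actual procedure.

```python
import numpy as np
from scipy.optimize import nnls

def rmspe_ratio(Y, treated, T0):
    """Post/pre RMSPE ratio of unit `treated` against its synthetic control.
    Y: (units x periods) outcome matrix; T0: number of pre-treatment periods."""
    donors = [i for i in range(Y.shape[0]) if i != treated]
    # Nonnegative least squares on the pre-period; weights are renormalized to
    # sum to one as a rough approximation of the simplex constraint.
    w, _ = nnls(Y[donors, :T0].T, Y[treated, :T0])
    w = w / w.sum() if w.sum() > 0 else np.full(len(donors), 1 / len(donors))
    gap = Y[treated] - w @ Y[donors]
    return np.sqrt((gap[T0:] ** 2).mean()) / np.sqrt((gap[:T0] ** 2).mean())

rng = np.random.default_rng(0)
Y = rng.normal(size=(20, 30))    # fake panel: 20 units, 30 periods
Y[0, 20:] += 2.0                 # unit 0 gets a treatment effect after T0 = 20

stats = [rmspe_ratio(Y, i, T0=20) for i in range(Y.shape[0])]
p_value = np.mean([s >= stats[0] for s in stats])  # rank-based exact p-value
print(f"placebo p-value for the treated unit: {p_value:.3f}")
```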

Relevance: 100.00%

Abstract:

This dissertation uses an empirical gravity equation approach to study the relationship between nonreciprocal trade agreements (NRTAs) and members' trade flows. Estimations relate bilateral imports to trade policy variables using a very comprehensive dataset with over fifty years of data. Results show that meager average trade effects exist only if members are excluded from the world trading system or if they are very poor. As trade flows between NRTA members are already rising before their creation, results also suggest a strong endogeneity concerning their formation. Moreover, estimations show that uncertainty and discretion tend to critically hinder NRTAs' performance. On the other hand, reciprocal trade agreements show the opposite pattern regardless of members' income status. Encouraging developing countries' openness to trade through reciprocal liberalization consequently emerges as a possible policy implication.
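
A hedged sketch of a standard PPML gravity specification like the one this abstract implies: bilateral imports on NRTA and RTA membership dummies with pair and country-year fixed effects. All column names and the input file are assumptions; with a dataset of this size, explicit dummies are infeasible and a high-dimensional fixed-effects estimator (e.g. Stata's ppmlhdfe) would be used in practice, but the statsmodels version shows the structure.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("bilateral_trade.csv")  # hypothetical long-format panel

# Poisson pseudo-maximum likelihood handles zero trade flows and
# heteroskedasticity; fixed effects enter as categorical dummies here.
res = smf.glm(
    "imports ~ nrta + rta + C(pair) + C(exp_year) + C(imp_year)",
    data=df,
    family=sm.families.Poisson(),
).fit(cov_type="cluster", cov_kwds={"groups": df["pair"]})
print(res.params[["nrta", "rta"]])
```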

Relevance: 100.00%

Abstract:

In Brazil, the selection of school principals is set in a decentralized manner by each state and city, so processes may vary over time for a given locality. In the state of Bahia, school principals were appointed by a higher political hierarchy until 2008, when schools under state administration started selecting principals by elections. The main goal of this work is to evaluate whether this rule change affected students' proficiency levels. This is achieved by using a panel data, difference-in-differences approach that compares state schools (treatment group) to city schools (control group), which did not face a selection rule change and thus kept having their principals politically appointed. The databases used are Prova Brasil 2007, 2009 and 2011, the first prior to and the latter two following the policy change. Our results suggest that students attending schools with elected principals have slightly lower mean proficiency levels in both the mathematics and Portuguese exams than those attending schools with appointed principals. This result, according to the literature, could be related to perverse effects of selecting school administrators by vote, such as corporatism, clientelism and politicization of the school environment.
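
A minimal difference-in-differences sketch matching the design described above: state schools are treated (elected principals from 2009 on), city schools are the control, across the three Prova Brasil waves. This is not the study's code, and the input file and variable names (math_score, network, school_id) are assumptions.

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("prova_brasil.csv")     # hypothetical school-year panel
df["post"] = (df["year"] >= 2009).astype(int)        # after the 2008 rule change
df["treated"] = (df["network"] == "state").astype(int)

# Year dummies absorb common shocks; the coefficient on treated:post is the
# DiD estimate of switching from appointed to elected principals.
res = smf.ols("math_score ~ treated + C(year) + treated:post", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["school_id"]}
)
print(res.params["treated:post"])
```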

Relevance: 100.00%

Abstract:

In the absence of Selective Availability, which was turned off on May 1, 2000, the ionosphere can be the largest source of error in GPS positioning and navigation. Its effects on the GPS observables are code delays and phase advances. The magnitude of this error is affected by the local time of day, season, solar cycle, geographic location of the receiver, and the Earth's magnetic field. As is well known, the ionosphere is the main drawback for high-accuracy positioning with single-frequency receivers, whether for point positioning or for relative positioning over medium and long baselines. The ionospheric effects were investigated for point positioning and relative positioning using single-frequency data. A model represented by a Fourier series was implemented, and its parameters were estimated from data collected at the active stations of the RBMC (Brazilian Network for Continuous Monitoring of GPS satellites). The input data were the pseudorange observables filtered by the carrier phase. Quality control was implemented in order to analyze the adjustment and to validate the significance of the estimated parameters. Experiments were carried out in the equatorial region using data collected from dual-frequency receivers. In order to validate the model, the estimated values were compared with ground truth. For point positioning and for relative positioning over baselines of approximately 100 km, the discrepancies indicated error reductions of better than 80% and 50%, respectively, compared with processing without the ionospheric model. These results indicate that more research is needed to support L1 GPS users in the equatorial region.
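
To make the estimation structure concrete, here is a hedged sketch of fitting a Fourier-series model of ionospheric delay as a function of local time by least squares. It does not reproduce the paper's actual parameterization or the carrier-smoothed pseudorange preprocessing; the data, harmonic count and units are illustrative assumptions.

```python
import numpy as np

def fourier_design(t_hours, n_harmonics=4, period=24.0):
    """Design matrix: a constant plus sine/cosine pairs in local time."""
    omega = 2 * np.pi / period
    cols = [np.ones_like(t_hours)]
    for k in range(1, n_harmonics + 1):
        cols += [np.cos(k * omega * t_hours), np.sin(k * omega * t_hours)]
    return np.column_stack(cols)

# Synthetic example: noisy delay peaking in the early afternoon.
rng = np.random.default_rng(1)
t = np.linspace(0, 24, 288)                       # local time, 5-minute samples
delay = 5 + 4 * np.exp(-((t - 14) ** 2) / 18) + rng.normal(0, 0.3, t.size)

A = fourier_design(t)
coef, *_ = np.linalg.lstsq(A, delay, rcond=None)  # estimated Fourier parameters
residuals = delay - A @ coef                      # basis for quality-control tests
print(f"RMS residual: {residuals.std():.3f} m")
```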

Relevance: 100.00%

Abstract:

The development of strategies for structural health monitoring (SHM) has become increasingly important because of the need to prevent undesirable damage. This paper describes an approach to this problem using vibration data. It involves a three-stage process: reduction of the time-series data using principal component analysis (PCA); development of a data-based model using an auto-regressive moving average (ARMA) model fitted to data from the undamaged structure; and classification of whether or not the structure is damaged using a fuzzy clustering approach. The approach is applied to data from a benchmark structure from Los Alamos National Laboratory, USA. Two fuzzy clustering algorithms are compared: the fuzzy c-means (FCM) and Gustafson-Kessel (GK) algorithms. It is shown that while both fuzzy clustering algorithms are effective, the GK algorithm marginally outperforms the FCM algorithm.
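
A compact sketch of the three-stage pipeline on synthetic signals (the benchmark data are not public here): autoregressive features per record, PCA reduction, and a minimal fuzzy c-means implementation. An AR(4) model stands in for the full ARMA fit, and the GK variant (which additionally adapts a per-cluster covariance norm) is omitted for brevity; all of that is an assumption, not the paper's code.

```python
import numpy as np
from sklearn.decomposition import PCA
from statsmodels.tsa.ar_model import AutoReg

def fuzzy_cmeans(X, c=2, m=2.0, iters=100, seed=0):
    """Minimal FCM; returns the soft membership matrix (n_samples x c)."""
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(c), size=len(X))        # rows sum to one
    for _ in range(iters):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None], axis=2)
        d = np.maximum(d, 1e-12)                      # guard against zero distance
        U = 1.0 / ((d[:, :, None] / d[:, None, :]) ** (2 / (m - 1))).sum(axis=2)
    return U

# Synthetic records: 20 "undamaged" noise signals, 20 with an added oscillation.
rng = np.random.default_rng(2)
records = [rng.normal(size=500) for _ in range(20)]
records += [rng.normal(size=500) + np.sin(np.arange(500) / 5) for _ in range(20)]

feats = np.array([AutoReg(x, lags=4).fit().params for x in records])
feats = PCA(n_components=2).fit_transform(feats)      # reduce AR features to 2-D

U = fuzzy_cmeans(feats, c=2)
print(U.argmax(axis=1))    # hard assignment: damaged vs. undamaged clusters
```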

Relevance: 100.00%

Abstract:

The use of middleware technology in various types of systems, in order to abstract low-level details related to the distribution of application logic, is increasingly common. Among the many systems that can benefit from these components, we highlight distributed systems, where communication between software components located on different physical machines must be supported. An important issue in the communication between distributed components is the provision of mechanisms for managing quality of service. This work presents a metamodel for modeling component-based middleware that provides an application with an abstraction of the communication between the components involved in a data stream, regardless of their location. Another feature of the metamodel is the possibility of self-adaptation of the communication mechanism, either by updating the values of its configuration parameters or by replacing it with another mechanism when the specified quality-of-service restrictions are not being met. To this end, the communication state is monitored (applying techniques such as a feedback control loop) and the related performance metrics are analyzed. The Model-Driven Development (MDD) paradigm was used to generate the implementation of a middleware that serves as a proof of concept of the metamodel, together with the configuration and reconfiguration policies related to the dynamic adaptation processes. Accordingly, the metamodel associated with the communication configuration process was defined. The MDD approach also comprises the definition of the following transformations: from the architectural model of the middleware to Java code, and from the configuration model to XML.
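
A minimal, language-agnostic sketch (written here in Python rather than the work's Java) of the self-adaptation idea: a feedback loop monitors a QoS metric of the active communication mechanism and either retunes a parameter or replaces the mechanism when the restriction is violated. All names, thresholds and the reconfiguration policy are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Mechanism:
    name: str
    buffer_size: int

def adapt(mechanism, latency_ms, max_latency_ms, fallback):
    """One iteration of the feedback control loop over a monitored metric."""
    if latency_ms <= max_latency_ms:
        return mechanism                  # QoS restriction holds: no change
    if mechanism.buffer_size > 1024:
        mechanism.buffer_size //= 2       # first try parameter reconfiguration
        return mechanism
    return fallback                       # otherwise replace the mechanism

active = Mechanism("rtp-stream", buffer_size=4096)
for observed in [40.0, 55.0, 80.0, 90.0]:         # simulated latency samples (ms)
    active = adapt(active, observed, max_latency_ms=50.0,
                   fallback=Mechanism("tcp-stream", 2048))
    print(active.name, active.buffer_size)
```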

Relevance: 100.00%

Abstract:

We present model results for the two-halo-neutron correlation function, C_nn, for the dissociation process of light exotic nuclei modelled as two neutrons and a core. A minimum is predicted for C_nn as a function of the relative momentum of the two neutrons, p_nn, due to the coherence of the neutrons in the halo and the final-state interaction. Studying the systems ¹⁴Be, ¹¹Li, and ⁶He within this model, we show that the numerical asymptotic limit C_nn → 1 occurs only for p_nn ≳ 400 MeV/c, while this limit is reached at much lower values of p_nn in an independent-particle model such as the one used in the analysis of recent experimental data. Our model is consistent with the data once the experimental correlation function is appropriately normalized.