893 results for ENTERPRISE STATISTICS
Abstract:
Economists have argued that regulation is the appropriate approach to keeping output at its economically efficient level in a natural monopoly, and that this can be achieved by submitting such companies to the decisions of regulatory agencies. Autonomous agencies are not, however, free in an absolute sense, and it is important to ask why their decisions should be insulated from the priorities of each new administration. One answer is that autonomy is designed to give leeway and discretionary power to unbiased professionals with expertise in their field. In practice, however, professional experts may often be politically motivated. The objective of this study is to investigate whether political nominations to the presidency of regulatory agencies, rather than technical appointments, affect the level of regulatory risk. To this end, an event study was performed in which the regulatory risk associated with a political nomination is compared with that of a technical nomination, measured in terms of abnormal returns.
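As a hedged illustration of the event-study mechanics this abstract describes, the sketch below estimates market-model abnormal returns and their sum (the cumulative abnormal return, CAR) around an announcement date; the window lengths, data layout and function name are illustrative assumptions, not the study's actual code.

```python
# Minimal event-study sketch (assumptions, not the study's code):
# market-model abnormal returns around an appointment announcement.
import numpy as np
import pandas as pd

def abnormal_returns(stock: pd.Series, market: pd.Series,
                     event_date: str, est_win: int = 120, evt_win: int = 5):
    """Fit R_stock = alpha + beta * R_market on the estimation window,
    then compute abnormal returns over the event window."""
    t0 = stock.index.get_loc(event_date)
    est = slice(t0 - est_win - evt_win, t0 - evt_win)   # estimation window
    evt = slice(t0 - evt_win, t0 + evt_win + 1)         # event window
    beta, alpha = np.polyfit(market.iloc[est], stock.iloc[est], 1)
    ar = stock.iloc[evt] - (alpha + beta * market.iloc[evt])
    return ar, ar.sum()  # abnormal returns and their sum (CAR)
```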
Abstract:
This Master's thesis consists of one theoretical article and one empirical article in the field of microeconometrics. The first chapter\footnote{We also thank Marinho Bertanha, Gabriel Cepaluni, Brigham Frandsen, Dalia Ghanem, Ricardo Masini, Marcela Mello, Áureo de Paula, Cristine Pinto, Edson Severnini and seminar participants at the São Paulo School of Economics, the California Econometrics Conference 2015 and the 37\textsuperscript{th} Brazilian Meeting of Econometrics for useful suggestions.}, called \emph{Synthetic Control Estimator: A Generalized Inference Procedure and Confidence Sets}, contributes to the literature on inference techniques for the Synthetic Control Method. This methodology was proposed to answer questions involving counterfactuals when only one treated unit and a few control units are observed. Although the method has been applied in many empirical works, the formal theory behind its inference procedure remains an open question. To fill this gap, we make explicit sufficient hypotheses that guarantee the adequacy of Fisher's exact hypothesis testing procedure for panel data, allowing us to test any \emph{sharp null hypothesis} and, consequently, to propose a new way to estimate confidence sets for the Synthetic Control Estimator by inverting a test statistic. These are the first confidence sets available when one has access only to finite-sample, aggregate-level data whose cross-sectional dimension may be larger than its time dimension. Moreover, we analyze the size and power of the proposed test in a Monte Carlo experiment and find that test statistics based on the synthetic control method outperform test statistics commonly used in the evaluation literature. We also extend our framework to the cases in which we observe more than one outcome of interest (simultaneous hypothesis testing) or more than one treated unit (pooled intervention effect), and to the case in which heteroskedasticity is present. The second chapter, called \emph{Free Economic Area of Manaus: An Impact Evaluation using the Synthetic Control Method}, is an empirical article. We apply the synthetic control method to Brazilian city-level data for the 20\textsuperscript{th} century in order to evaluate the economic impact of the Free Economic Area of Manaus (FEAM). We find that this enterprise zone had significant positive effects on real GDP per capita and total services production per capita, but significant negative effects on total agricultural production per capita. Our results suggest that this subsidy policy achieved its goal of promoting regional economic growth, even though it may have induced misallocation of resources across economic sectors.
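A minimal sketch of the Fisher-style placebo (permutation) test described above, under simplifying assumptions: the synthetic control is built by non-negative least squares on pre-treatment outcomes only (no covariates), and the test statistic is the post/pre RMSPE ratio. Function names and layout are illustrative, not the thesis' implementation.

```python
# Hedged sketch of a placebo test for the synthetic control method.
# Y: (units x periods) outcome matrix; T0: number of pre-treatment periods.
import numpy as np
from scipy.optimize import nnls

def sc_weights(Y0_pre, y1_pre):
    """Simplified synthetic control: non-negative least-squares weights
    on control units, renormalized to sum to one (no covariates)."""
    w, _ = nnls(Y0_pre, y1_pre)
    return w / w.sum() if w.sum() > 0 else np.full(Y0_pre.shape[1], 1.0 / Y0_pre.shape[1])

def rmspe_ratio(Y, unit, T0):
    """Post/pre RMSPE ratio treating `unit` as if it were treated."""
    ctrl = [j for j in range(Y.shape[0]) if j != unit]
    w = sc_weights(Y[ctrl, :T0].T, Y[unit, :T0])
    gap = Y[unit] - w @ Y[ctrl]
    return np.sqrt((gap[T0:] ** 2).mean()) / np.sqrt((gap[:T0] ** 2).mean())

def placebo_pvalue(Y, treated, T0):
    """Share of units whose statistic is at least as extreme as the
    treated unit's -- the exact p-value under the sharp null."""
    stats = np.array([rmspe_ratio(Y, j, T0) for j in range(Y.shape[0])])
    return float((stats >= stats[treated]).mean())
```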
Abstract:
The power-law size distributions obtained experimentally for neuronal avalanches are important evidence of criticality in the brain. This evidence is supported by the fact that a critical branching process exhibits the same exponent, τ ≈ 3/2. Models at criticality have been employed to mimic avalanche propagation and explain the statistics observed experimentally. However, a crucial aspect of neuronal recordings has been almost completely neglected in the models: undersampling. While a typical multielectrode array records hundreds of neurons, tens of thousands of neurons can be found in the same area of neuronal tissue. Here we investigate the consequences of undersampling in models with three different topologies (two-dimensional, small-world and random network) and three different dynamical regimes (subcritical, critical and supercritical). We find that undersampling modifies avalanche size distributions, extinguishing the power laws observed in critical systems. Distributions from subcritical systems are also modified, but the shape of the undersampled distributions is closer to that of a fully sampled system. Undersampled supercritical systems can recover the general characteristics of the fully sampled version, provided that enough neurons are measured. Undersampling in two-dimensional and small-world networks leads to similar effects, while the random network is insensitive to sampling density due to the lack of a well-defined neighborhood. We conjecture that neuronal avalanches recorded from local field potentials avoid undersampling effects due to the nature of this signal, but the same does not hold for spike avalanches. We conclude that undersampled branching-process-like models in these topologies fail to reproduce the statistics of spike avalanches.
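To make the undersampling effect concrete, here is a hedged toy sketch (a mean-field branching process, not the paper's specific topologies): avalanches are observed either in full or through a small random subset of units, mimicking a sparse electrode array.

```python
# Toy branching-process avalanches with random subsampling.
# sigma is the branching ratio: sigma = 1 is critical (sizes ~ s^{-3/2}).
import numpy as np

rng = np.random.default_rng(0)

def avalanche_sizes(n_units=10_000, sigma=1.0, n_avalanches=5_000,
                    sampled_fraction=0.01):
    """Return (full, subsampled) avalanche sizes."""
    observed = rng.random(n_units) < sampled_fraction  # 'electrode' mask
    full, sub = [], []
    for _ in range(n_avalanches):
        active = rng.integers(n_units, size=1)  # one seed spike
        size_full = size_sub = 0
        while active.size:
            size_full += active.size
            size_sub += int(observed[active].sum())
            # each active unit triggers Poisson(sigma) units on average
            n_next = int(rng.poisson(sigma, active.size).sum())
            active = rng.integers(n_units, size=min(n_next, n_units))
        full.append(size_full)
        sub.append(size_sub)
    return np.array(full), np.array(sub)
```

Comparing histograms of `full` and `sub` at `sigma = 1.0` illustrates the qualitative point: the fully sampled distribution follows a power law, while the subsampled one is strongly distorted.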
Abstract:
This work addresses the organizational and content perspectives of the Enterprise Content Management (ECM) framework. A case study at the Federal University of Rio Grande do Norte (UFRN) used an ECM model to analyse the information management provided by its three main administrative systems: the Integrated Management of Academic Activities (SIGAA), the Integrated System of Inheritance and Contracts Administration (SIPAC), and the Integrated System for Administration and Human Resources (SIGRH). A case study protocol was designed to lend greater reliability to the research process. Four propositions were examined in order to reach the specific objectives of identifying and evaluating ECM components from the UFRN perspective. The preliminary phase provided the guidelines for data collection. In total, 75 individuals were interviewed. Interviews with four managers directly involved in the systems' design were recorded (average duration of 90 minutes); the 70 remaining individuals were approached at random in UFRN's units and included teachers, administrative-technical employees and students. The results showed the presence of many ECM elements in the management of UFRN's administrative information. The most prevalent technological component was web content management / collaboration, but initiatives involving other components (e.g. email and document management) were also found and are in continuous improvement. The assessment used eQual 4.0 to examine the effectiveness of the applications under three factors: usability, quality of information, and service offered. In general, the quality offered by the systems was very good and is consistent with the benefits obtained from adopting an ECM strategy across the whole institution.
Abstract:
In recent decades the public sector has come under pressure to improve its performance, and the use of Information Technology (IT) has been an increasingly common tool for reaching that goal. It has therefore become an important issue in public organizations, particularly in institutions of higher education, to determine which factors influence the acceptance and use of technology, since they affect the success of implementation and the desired organizational results. The Technology Acceptance Model (TAM), built on the constructs of perceived usefulness and perceived ease of use, was used as the basis for this study. However, given the complexity of implementing integrated management systems, organizational factors were added to the model to better explain the acceptance of such systems. Five constructs related to critical success factors in implementing ERP systems were thus added to the TAM: top management support, communication, training, cooperation, and technological complexity (BUENO and SALMERON, 2008). From this, the following research problem arises: which factors influence the acceptance and use of the SIE academic module at the Federal University of Pará (UFPA), from the perception of its teacher and technician users? The purpose of this study was to identify the influence of organizational factors and behavioral antecedents on the behavioral intention to use the SIE academic module at UFPA from the perspective of teacher and technician users. This is applied, exploratory and descriptive, quantitative research implemented through a survey; data collection occurred through a structured questionnaire applied to a sample of 229 teachers and 30 technical and administrative staff. Data analysis was carried out through descriptive statistics and structural equation modeling with the partial least squares (PLS) technique. The measurement model was assessed first, and reliability, convergent validity and discriminant validity were verified for all indicators and constructs. The structural model was then analyzed using bootstrap resampling; all hypotheses were supported. The coefficient of determination (R²) was high or moderate in five of the six endogenous variables, and the model explains 47.3% of the variation in behavioral intention. Notably, among the antecedents of behavioral intention (BI) analyzed in this study, perceived usefulness is the variable with the greatest effect on behavioral intention, followed by perceived ease of use (PEU) and attitude (AT). Among the organizational aspects (critical success factors) studied, technological complexity (TC) and training (ERT) had the greatest effects on behavioral intention to use, although these effects were smaller than those produced by the behavioral factors originating from the TAM. Top management support (TMS) showed, among all variables, the smallest effect on intention to use (BI), followed by communication (COM) and cooperation (CO), which also exert a low effect on behavioral intention (BI). As in other TAM studies, the constructs proved adequate for the present research. The study thus contributes evidence that the Technology Acceptance Model can be applied to predict the acceptance of integrated management systems, even in public institutions.
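As a hedged illustration of the bootstrap step used above to judge path significance, the sketch below resamples a standardized regression coefficient between two observed composite scores (a stand-in for a PLS path such as PU → BI); real PLS-SEM first estimates latent variable scores, which this toy version skips.

```python
# Bootstrap a standardized path coefficient and its 95% CI.
import numpy as np

def bootstrap_path(pu, bi, n_boot=5_000, seed=42):
    """pu, bi: 1-D arrays of composite scores for two constructs."""
    rng = np.random.default_rng(seed)
    pu = (pu - pu.mean()) / pu.std()
    bi = (bi - bi.mean()) / bi.std()
    n = len(pu)
    coefs = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(n, size=n)              # resample respondents
        coefs[b] = np.polyfit(pu[idx], bi[idx], 1)[0]
    return coefs.mean(), np.percentile(coefs, [2.5, 97.5])
```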
Abstract:
This thesis evaluates a workplace gymnastics (ginástica laboral) program at an electrical energy company in Rio Grande do Norte. The main aim of the work is to analyse the program with emphasis on the functional performance and personal changes of staff in different categories. The work has two parts. First, a bibliographic study covers the concepts and history of ergonomics and the main focus of the study: workplace gymnastics, performance factors and lifestyle. Second, a field study was carried out in which a questionnaire was applied to a population of 160 employees who participate in the workplace gymnastics program; the data were then examined through three statistical analyses: multivariate cluster analysis, chi-square tests and multiple linear regression. The results confirm that the program, implemented at the company two years earlier, improves the functional performance of white-collar and operational workers and also changes some aspects of their lifestyle, such as diet, leisure and stress level. In conclusion, investment in quality-of-life programs was found to benefit both the staff and the company, which can count on good service and customer satisfaction. Workplace gymnastics, as both an ergonomic tool and a physical activity, is therefore an investment that increasingly needs to be strengthened and adopted by many segments of society.
Abstract:
The oil industry has several segments that can impact the environment. Among these, produced water stands out as an environmental problem because of the large volumes generated and its toxic composition; such waters are the major source of waste in the oil industry. The composition of produced water is strongly dependent on the production field. A good example is the wastewater produced at a Petrobras operating unit in Rio Grande do Norte and Ceará (UO-RNCE). A single effluent treatment station (ETS) of this unit receives effluent from 48 wells (onshore and offshore), which leads to large fluctuations in water quality that can become a complicating factor for downstream treatment processes. The present work aims to characterize a sample of produced water from the UO-RNCE with respect to certain physical and physico-chemical parameters (chloride concentration, conductivity, dissolved oxygen, pH, TOG (oil & grease), nitrate concentration, turbidity, salinity and temperature). The effluent was analyzed by means of an MP TROLL 9500 multiparameter probe, a TOG/TPH Infracal from Wilks Enterprise Corp. (Model HATR-T) and a Digimed MD-31 conductivity meter. Results were examined by univariate and multivariate analysis (principal component analysis) combined with statistical control charts. The multivariate analysis showed a negative correlation between dissolved oxygen and turbidity (-0.55) and positive correlations between salinity and chloride (1.00) and between conductivity and both chloride and salinity (0.70). It also showed that seven principal components explain the variability of the parameters, with salinity, conductivity and chloride being the most important variables, exhibiting the highest sampling variance. The statistical control charts helped to establish a general trend among the evaluated physical and chemical parameters.
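A hedged sketch of the analysis pipeline described above: standardize the measurements, run a principal component analysis, and place 3-sigma Shewhart control limits on one parameter. The synthetic data and column names are illustrative stand-ins, not the study's measurements.

```python
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
df = pd.DataFrame({                      # synthetic stand-in data
    "chloride": rng.normal(60_000, 5_000, 200),
    "salinity": rng.normal(100_000, 8_000, 200),
    "conductivity": rng.normal(150.0, 12.0, 200),
    "turbidity": rng.normal(30.0, 6.0, 200),
})

# PCA on standardized variables: how much variance each component explains
X = StandardScaler().fit_transform(df)
pca = PCA().fit(X)
print("explained variance ratios:", pca.explained_variance_ratio_.round(2))

# 3-sigma Shewhart control limits for a single parameter
s = df["salinity"]
ucl, lcl = s.mean() + 3 * s.std(), s.mean() - 3 * s.std()
print("out-of-control samples:", int(((s > ucl) | (s < lcl)).sum()))
```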
Abstract:
Integrated management systems, or Enterprise Resource Planning (ERP) systems, make it possible to process all the information a company needs using a single database. Much has been written about this type of software, addressing issues such as the high cost of license acquisition and the dependence on consulting services for its adaptation and deployment in companies. The development and use of Free/Open Source ERP (FOS-ERP) has been growing, yet this type of system remains insufficiently explored, even in academia. This article reviews some published work on the subject and raises questions that should be addressed by researchers and other interested parties in order to adapt these systems and make their use viable in the national context. After an introduction to the topic, some differences between FOS-ERP and its proprietary counterparts (Proprietary ERP, or P-ERP) are presented in terms of business models, selection, customization and evolution. Next, some challenges and opportunities that FOS-ERP can offer to users, vendors, researchers and individual contributors are listed. In conclusion, this article seeks to broaden the discussion on FOS-ERP, highlighting factors such as its potential for technological innovation and business strategies.
Abstract:
Exact and closed-form expressions for the level crossing rate and average fade duration are presented for equal gain combining and maximal ratio combining schemes, assuming an arbitrary number of independent branches in a Rayleigh environment. The analytical results are thoroughly validated by simulation.
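For reference, the classic no-diversity Rayleigh expressions to which any M-branch result must reduce for a single branch are the Jakes formulas below (standard results, not quoted from this paper), with ρ the envelope threshold normalized by its RMS value and f_m the maximum Doppler shift:

```latex
\begin{align}
  N_R(\rho) &= \sqrt{2\pi}\, f_m\, \rho\, e^{-\rho^2}
    && \text{(level crossing rate)} \\
  T_R(\rho) &= \frac{e^{\rho^2} - 1}{\rho\, f_m \sqrt{2\pi}}
    && \text{(average fade duration)}
\end{align}
```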
Abstract:
Exact and closed-form expressions for the level crossing rate and average fade duration are presented for the M branch pure selection combining (PSC), equal gain combining (EGC), and maximal ratio combining (MRC) techniques, assuming independent branches in a Nakagami environment. The analytical results are thoroughly validated by reducing the general case to some special cases, for which the solutions are known, and by means of simulation for the more general case. The model developed here is general and can be easily applied to other fading statistics (e.g., Rice).
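Likewise, the standard single-branch Nakagami-m expressions (classical results reproduced here for orientation, not taken from this paper) generalize the Rayleigh case, which is recovered for m = 1; here γ(·,·) is the lower incomplete gamma function:

```latex
\begin{align}
  N_R(\rho) &= \sqrt{2\pi}\, f_m\, \frac{m^{\,m-1/2}}{\Gamma(m)}\,
               \rho^{\,2m-1}\, e^{-m\rho^2} \\
  T_R(\rho) &= \frac{\gamma\!\left(m,\, m\rho^2\right) e^{\,m\rho^2}}
               {\sqrt{2\pi}\, f_m\, m^{\,m-1/2}\, \rho^{\,2m-1}}
\end{align}
```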
Abstract:
In the quark model of the nucleon, the Fermi statistics of the elementary constituents can influence significantly the properties of multinucleon bound systems. In the Skyrme model, on the other hand, the basic quanta are bosons, so that qualitatively different statistics effects can be expected a priori. In order to illustrate this point, we construct schematic one-dimensional quark and soliton models which yield fermionic nucleons with identical baryon densities. We then compare the baryon densities of a two-nucleon bound state in both models. Whereas in the quark model the Pauli principle for quarks leads to a depletion of the density in the central region of the nucleus, the soliton model predicts a slight increase of the density in that region, due to the bosonic statistics of the meson-field quanta.
Abstract:
Tsallis postulated a generalized form of entropy, giving rise to a new statistics now known as Tsallis statistics. In the present work, we compare Tsallis statistics with the gradually truncated Lévy flight, and discuss the distribution of an economic index, the Standard and Poor's 500, using the values of standard deviation calculated by our model. We find that both statistics give almost the same distribution. Thus we feel that the gradual truncation of the Lévy distribution beyond a certain critical step size, when describing complex systems, is a requirement of generalized thermodynamics or something similar. The gradually truncated Lévy flight is based on physical considerations and brings a better physical picture of the dynamics of the whole system, while Tsallis statistics gives it theoretical support. Together, the two statistics can be utilized to develop a more exact portfolio theory or to better understand the complexities of human and financial behavior. A comparison of both statistics is made.
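For reference, the generalized entropy Tsallis postulated is the standard expression below (reproduced from the general literature, not quoted from this paper); the Boltzmann-Gibbs entropy is recovered in the limit q → 1:

```latex
\begin{equation}
  S_q = k\,\frac{1 - \sum_i p_i^{\,q}}{q - 1},
  \qquad
  \lim_{q \to 1} S_q = -k \sum_i p_i \ln p_i .
\end{equation}
```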