892 results for ENTERPRISE STATISTICS


Relevance:

20.00%

Publisher:

Abstract:

This presentation will outline an effective model for a hybrid statistics course. The course continues to be very successful, incorporating online instruction, testing, blogs and, above all, a trajectory driven by a data analysis project that motivates students to engage more actively in the class and rise to the challenge of writing an original research paper. Obstacles, benefits and successes of this endeavor will be addressed.

Relevance:

20.00%

Publisher:

Abstract:

The main objective of this degree project is to implement an Application Availability Monitoring (AAM) system named Softek EnView for Fujitsu Services. The aim of implementing the AAM system is to proactively identify end-user performance problems, such as application and site performance, before actual end users experience them. No matter how well applications and sites are designed and no matter how well they meet business requirements, they are useless to end users if their performance is slow and/or unreliable. It is important for customers to find out whether end-user problems are caused by the network or by an application malfunction. Softek EnView comprises the following components: Robot, Monitor, Reporter, Collector and Repository. The implemented system, however, is designed to use only some of these EnView elements: Robot, Reporter and Repository. Robots can be placed at any key user location and are dedicated to customers, which means that as the number of customers increases, the number of Robots increases as well. To make the AAM system ideal for the company to use, it was integrated with Fujitsu Services’ centralised monitoring system, BMC PATROL Enterprise Manager (PEM). That was in fact the reason for deciding to drop the EnView Monitor element. After the system was fully implemented, the AAM system was ready for production. Transactions were (and are) written and deployed on Robots to simulate typical end-user actions. These transactions are configured to run at certain intervals, which are defined together with customers. While they are driven against customers’ applications automatically, the transactions continuously collect availability and response-time data. In case of a failure in a transaction, the Robot immediately quits the transaction and writes detailed information to a log file about what went wrong and which element failed while stepping through the application. An alert is then generated by a BMC PATROL Agent based on this data and sent to the BMC PEM. Fujitsu Services’ monitoring room receives the alert and reacts to it according to the ITIL incident management process, alerting system specialists to critical incidents so that problems are resolved. Based on the data gathered by the Robots, weekly reports containing detailed statistics and trend analyses of the ongoing quality of IT services are provided to the customers.
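
The monitoring loop described above can be pictured with a short sketch. The following is a minimal, hypothetical synthetic-transaction robot for a plain HTTP application; the real EnView Robot scripting and BMC PATROL alerting interface are proprietary, so the probe URL, interval, log file and alert stub below are illustrative stand-ins, not the actual implementation.

```python
# Minimal sketch of a synthetic-transaction "robot", assuming a plain HTTP
# application. The probe URL, interval, log file and alert stub are
# hypothetical stand-ins; the real EnView Robot scripting and BMC PATROL
# integration are proprietary.
import logging
import time
import urllib.request

PROBE_URL = "http://example.invalid/app/login"   # hypothetical end-user entry point
INTERVAL_S = 300                                 # run every 5 minutes, as agreed with the customer

logging.basicConfig(filename="robot.log", level=logging.INFO)


def run_transaction(url: str) -> float:
    """Drive one scripted step and return its response time in seconds."""
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=30) as resp:
        resp.read()
        if resp.status != 200:
            raise RuntimeError(f"unexpected HTTP status {resp.status}")
    return time.monotonic() - start


def send_alert(message: str) -> None:
    """Stand-in for handing the failure over to the central monitoring system."""
    logging.error("ALERT: %s", message)


while True:
    try:
        elapsed = run_transaction(PROBE_URL)
        logging.info("transaction OK, response time %.2f s", elapsed)
    except Exception as exc:      # quit the transaction and record which step failed
        logging.exception("transaction failed")
        send_alert(f"synthetic transaction failed: {exc}")
    time.sleep(INTERVAL_S)
```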

Relevance:

20.00%

Publisher:

Abstract:

Architectural description languages (ADLs) are used to specify a high-level, compositional view of a software application. ADLs usually come equipped with a rigorous state-transition style semantics, facilitating the specification and analysis of distributed and event-based systems. However, enterprise system architectures built upon newer middleware (implementations of Java’s EJB specification, or Microsoft’s COM+/.NET) require additional expressive power from an ADL. The TrustME ADL is designed to meet this need. In this paper, we describe several aspects of TrustME that facilitate the specification and analysis of middleware-based architectures for the enterprise.
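
As an illustration of the state-transition style of semantics the abstract refers to, here is a minimal sketch (not TrustME's actual syntax, which is defined in the paper) of a component whose behaviour is a labelled transition system, with a toy EJB-style session bean as the example.

```python
# Illustrative sketch only: this is not TrustME's syntax, just the general ADL
# idea of a component whose behaviour is a labelled state-transition system.
from dataclasses import dataclass, field


@dataclass
class Component:
    name: str
    states: set[str]
    initial: str
    # transitions: (current state, event) -> next state
    transitions: dict[tuple[str, str], str] = field(default_factory=dict)

    def step(self, state: str, event: str) -> str:
        """Follow one labelled transition; fail if the event is not enabled."""
        try:
            return self.transitions[(state, event)]
        except KeyError:
            raise ValueError(f"{self.name}: event {event!r} not enabled in {state!r}")


# A toy EJB-style session bean described as a state machine.
bean = Component(
    name="OrderBean",
    states={"pooled", "ready", "invoking"},
    initial="pooled",
    transitions={
        ("pooled", "activate"): "ready",
        ("ready", "invoke"): "invoking",
        ("invoking", "return"): "ready",
        ("ready", "passivate"): "pooled",
    },
)

state = bean.initial
for event in ["activate", "invoke", "return", "passivate"]:
    state = bean.step(state, event)
print(state)  # back to "pooled"
```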

Relevance:

20.00%

Publisher:

Abstract:

This document is a master’s thesis, submitted in partial fulfilment of the requirements for the degree of Master in Administration at the Universidade Federal do Rio Grande do Sul. The research topic is the relationship between the technical characteristics of an information and decision-support system project and users’ behaviour when using it. The objective is to develop and present a conceptual model for EIS (“Enterprise Information Systems”), drawing on the literature, technological trends and case studies, that identifies characteristics supporting proactive user behaviour in information retrieval. Proactive information-retrieval behaviour was defined as the combination of two categories: data exploration and focused search. Among the main results are the definition of categories related to system characteristics (flexibility, integration and presentation) and of categories related to users’ information-retrieval behaviour (data exploration and focused search), as well as the presentation of a conceptual model for EIS. The study also explored new techniques for qualitative data analysis, applied with the aim of better preserving context in the case studies.

Relevance:

20.00%

Publisher:

Abstract:

Information technology and, more specifically, Enterprise Systems (ESs) have received enormous attention and massive investment in organisations. Because the technology is relatively new, studies of ESs have not clarified their impact on organisational transformation in terms of people and culture from a sociotechnical, structuralist and humanist perspective. Through a qualitative study built as a bricolage of several analysis settings and subject profiles, using discourse content analysis, the hypotheses about the impacts of ESs could be confirmed with respect to improvements in person/person, person/top-management and person/work relations, the basis of the “Um Excelente Lugar para se Trabalhar” (“A Great Place to Work”) survey, as well as impacts on other variables analysed. The study went a step further and examined the skills of system implementers and software developers, the companies’ stage of evolutionary maturity, and unclear implementation objectives as factors that hinder ES implementation, resulting in under-utilisation, resistance and wasted resources. The work also includes an extensive literature review showing that there is still no reasonable degree of clarity about the real characteristics of the “ES implementation” phenomenon, its justifications, problems, objectives, critical success factors, and so on.

Relevance:

20.00%

Publisher:

Abstract:

This study addresses the impact of implementing ERP (Enterprise Resource Planning) systems on accounting and on the role of the management accountant. It analyses the structural, functional and responsibility-related changes that occurred in management accounting after ERP implementation, as well as the changes in the management accountant’s role, functions and importance within organisations. In light of this new role, it identifies the new skills the management accountant will need to acquire.

Relevance:

20.00%

Publisher:

Abstract:

There are several types of risk an organisation may face. They are normally managed in isolation within each unit or division. Given the more volatile environment of the new economy, a risk management model is proposed that seeks to integrate all the different types of risk, called Enterprise-Wide Risk Management (EWRM). The model organises risk management from a portfolio perspective, informing corporate strategy and creating shareholder value. The work traces the evolution of risk management models up to EWRM and proposes a methodology for its implementation as well as the key factors for its success.
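
To make the portfolio perspective concrete, the following minimal sketch (illustrative numbers only, not taken from the dissertation) compares the sum of stand-alone risks across units with the integrated, correlation-aware figure that an EWRM-style aggregation would report.

```python
# Hypothetical numbers, purely illustrative: compares the sum of stand-alone
# risks per unit with the integrated, correlation-aware portfolio risk.
import numpy as np

stdev = np.array([10.0, 6.0, 4.0])   # stand-alone risk of three units (e.g. $ million)
corr = np.array([
    [1.0, 0.3, 0.1],                 # correlations between the units' losses
    [0.3, 1.0, 0.2],
    [0.1, 0.2, 1.0],
])

# Managed in isolation, risks are simply added together.
isolated_total = stdev.sum()

# Portfolio view: total variance = sum_ij sigma_i * sigma_j * rho_ij.
portfolio_total = float(np.sqrt(stdev @ corr @ stdev))

print(f"sum of stand-alone risks:  {isolated_total:.1f}")
print(f"integrated portfolio risk: {portfolio_total:.1f}")   # lower, reflecting diversification
```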

Relevance:

20.00%

Publisher:

Abstract:

Economists have argued that regulation is the appropriate approach to keep output at its economically efficient level in a natural monopoly, and that this can be achieved by subjecting these companies to the decisions of regulatory agencies. Autonomous agencies are, however, not free in an absolute sense, and it is important to ask what the priorities of each new administration are. One answer is that agency autonomy is designed to give leeway and discretionary powers to unbiased professionals with expertise in their field. In practice, however, professional experts may often be politically motivated. The objective of this study is to investigate whether political nominations to the presidency of regulatory agencies, rather than technical appointments, affect the level of regulatory risk. To that end, an event study was performed in which the regulatory risk associated with a political nomination is compared to that of a technical nomination, in terms of abnormal returns.
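
As a rough illustration of the event-study machinery mentioned above (the estimation and event windows used in the study are not reproduced here), a market-model abnormal-return calculation could look like the sketch below; the function name, window lengths and use of calendar days are assumptions for illustration.

```python
# A minimal market-model event-study sketch. The function name, window lengths
# and the use of calendar days are assumptions, not the study's specification.
import numpy as np
import pandas as pd


def cumulative_abnormal_return(firm: pd.Series,
                               market: pd.Series,
                               event_date: pd.Timestamp,
                               est_window: int = 120,
                               event_window_days: int = 5) -> float:
    """Market-model CAR in a +/- event_window_days window around event_date."""
    data = pd.DataFrame({"firm": firm, "market": market}).dropna()

    # Estimation window: the last est_window observations before the event.
    pre = data.loc[data.index < event_date].tail(est_window)

    # Market model r_firm = alpha + beta * r_market, estimated by OLS.
    beta, alpha = np.polyfit(pre["market"], pre["firm"], 1)

    # Abnormal return = realised return minus the model's prediction.
    window = pd.Timedelta(days=event_window_days)
    around = data.loc[(data.index >= event_date - window) &
                      (data.index <= event_date + window)]
    abnormal = around["firm"] - (alpha + beta * around["market"])
    return float(abnormal.sum())
```

Comparing the distribution of such CARs around politically motivated nominations with that around technical ones is what would indicate whether the former carry additional regulatory risk.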

Relevance:

20.00%

Publisher:

Abstract:

This Master’s thesis consists of one theoretical article and one empirical article in the field of microeconometrics. The first chapter, “Synthetic Control Estimator: A Generalized Inference Procedure and Confidence Sets”, contributes to the literature on inference techniques for the Synthetic Control Method. This methodology was proposed to answer questions involving counterfactuals when only one treated unit and a few control units are observed. Although the method has been applied in many empirical works, the formal theory behind its inference procedure is still an open question. To fill this gap, we make explicit the sufficient hypotheses that guarantee the adequacy of Fisher’s Exact Hypothesis Testing Procedure for panel data, allowing us to test any sharp null hypothesis and, consequently, to propose a new way to estimate confidence sets for the Synthetic Control Estimator by inverting a test statistic; this is the first confidence set available when we observe only finite-sample, aggregate-level data whose cross-sectional dimension may be larger than its time dimension. Moreover, we analyze the size and power of the proposed test in a Monte Carlo experiment and find that test statistics based on the synthetic control method outperform test statistics commonly used in the evaluation literature. We also extend our framework to the cases in which we observe more than one outcome of interest (simultaneous hypothesis testing) or more than one treated unit (pooled intervention effect), and to the case in which heteroskedasticity is present. The second chapter, “Free Economic Area of Manaus: An Impact Evaluation using the Synthetic Control Method”, is an empirical article. We apply the synthetic control method to Brazilian city-level data for the 20th century in order to evaluate the economic impact of the Free Economic Area of Manaus (FEAM). We find that this enterprise zone had positive and significant effects on real GDP per capita and on total services production per capita, but it also had negative and significant effects on total agricultural production per capita. Our results suggest that this subsidy policy achieved its goal of promoting regional economic growth, even though it may have caused misallocation of resources among economic sectors.
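
For readers unfamiliar with the mechanics, the sketch below shows the core idea in simplified form: fit non-negative weights on pre-treatment outcomes, compute the post-treatment gap, and obtain a Fisher-style p-value by re-estimating the gap for every placebo unit. The constrained least-squares step and the test statistic are deliberately simplified stand-ins for the procedures developed in the chapter.

```python
# Simplified sketch of the synthetic control estimator with a Fisher-style
# placebo (permutation) test. Y is a (units x periods) outcome matrix with the
# treated unit in row 0 and treatment starting at period T0.
import numpy as np
from scipy.optimize import nnls


def synthetic_control_effect(Y: np.ndarray, unit: int, T0: int) -> float:
    """Average post-treatment gap between a unit and its synthetic control."""
    donors = [i for i in range(Y.shape[0]) if i != unit]
    # Non-negative weights that reproduce the unit's pre-treatment path.
    w, _ = nnls(Y[donors, :T0].T, Y[unit, :T0])
    w = w / w.sum() if w.sum() > 0 else np.full(len(donors), 1.0 / len(donors))
    synthetic = w @ Y[donors]
    return float(np.mean(Y[unit, T0:] - synthetic[T0:]))


def placebo_p_value(Y: np.ndarray, T0: int) -> float:
    """Share of units whose placebo effect is at least as large as the treated one's."""
    effects = np.array([abs(synthetic_control_effect(Y, i, T0)) for i in range(Y.shape[0])])
    return float(np.mean(effects >= effects[0]))


rng = np.random.default_rng(0)
Y = rng.normal(size=(20, 30)).cumsum(axis=1)   # 20 units, 30 periods of simulated data
Y[0, 15:] += 3.0                               # impose an effect on the "treated" unit
print(placebo_p_value(Y, T0=15))
```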

Relevance:

20.00%

Publisher:

Abstract:

The power-law size distributions obtained experimentally for neuronal avalanches are important evidence of criticality in the brain. This evidence is supported by the fact that a critical branching process exhibits the same exponent, τ ≈ 3/2. Models at criticality have been employed to mimic avalanche propagation and explain the statistics observed experimentally. However, a crucial aspect of neuronal recordings has been almost completely neglected in the models: undersampling. While a typical multielectrode array records hundreds of neurons, the same area of neuronal tissue contains tens of thousands of neurons. Here we investigate the consequences of undersampling in models with three different topologies (two-dimensional, small-world and random network) and three different dynamical regimes (subcritical, critical and supercritical). We found that undersampling modifies avalanche size distributions, extinguishing the power laws observed in critical systems. Distributions from subcritical systems are also modified, but the shape of the undersampled distributions is more similar to that of a fully sampled system. Undersampled supercritical systems can recover the general characteristics of the fully sampled version, provided that enough neurons are measured. Undersampling in two-dimensional and small-world networks leads to similar effects, while the random network is insensitive to sampling density due to the lack of a well-defined neighborhood. We conjecture that neuronal avalanches recorded from local field potentials avoid undersampling effects due to the nature of this signal, but the same does not hold for spike avalanches. We conclude that undersampled branching-process-like models in these topologies fail to reproduce the statistics of spike avalanches.
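
A minimal simulation makes the undersampling effect easy to reproduce. The sketch below uses a simple, topology-free branching process with Poisson offspring and models recording only a fraction of the activity by binomial thinning of each avalanche; this is a crude stand-in for the network topologies and electrode sampling studied in the paper.

```python
# Topology-free branching-process sketch of neuronal avalanches and
# undersampling. Each active neuron activates a Poisson(sigma) number of
# neurons in the next step; recording each activation with probability p
# (binomial thinning) is a crude stand-in for sparse electrode sampling.
import numpy as np

rng = np.random.default_rng(1)


def avalanche_size(sigma: float) -> int:
    """Total number of activations triggered by a single seed neuron."""
    active, total = 1, 1
    while active > 0 and total < 10_000:   # cap keeps supercritical runs finite
        active = rng.poisson(sigma * active)
        total += active
    return total


def sampled_sizes(sigma: float, p: float, n_avalanches: int = 10_000) -> np.ndarray:
    """Avalanche sizes observed when each activation is recorded with probability p."""
    sizes = np.array([avalanche_size(sigma) for _ in range(n_avalanches)])
    observed = rng.binomial(sizes, p)
    return observed[observed > 0]          # avalanches with no recorded spike are invisible


full = sampled_sizes(sigma=1.0, p=1.0)     # critical, fully sampled: heavy ~ s^(-3/2) tail
under = sampled_sizes(sigma=1.0, p=0.01)   # same dynamics, only 1% of activity recorded
print(full.mean(), under.mean())
```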