928 results for Iterative Implementation Model
Abstract:
The aim of this study is to propose the implementation of a statistical model for volatility calculation that is not widespread in the Brazilian literature, the local scale model (LSM), presenting its advantages and disadvantages relative to the models commonly used for risk measurement. For parameter estimation, daily Ibovespa quotes from January 2009 to December 2014 are used, and the empirical accuracy of the models is assessed through out-of-sample tests, comparing the VaR values obtained for the period from January to December 2014. Explanatory variables were introduced in an attempt to improve the models, and the American counterpart of the Ibovespa, the Dow Jones index, was chosen because it exhibited properties such as high correlation, Granger causality, and a significant log-likelihood ratio. One of the innovations of the local scale model is that it does not use the variance directly, but rather its reciprocal, called the "precision" of the series, which follows a kind of multiplicative random walk. The LSM captured all the stylized facts of financial series, and the results favored its use; the model is therefore an efficient and parsimonious specification alternative for estimating and forecasting volatility, as it has only one parameter to be estimated, which represents a paradigm shift relative to conditional heteroskedasticity models.
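The abstract's central idea is to track the reciprocal of the variance (the "precision") with a single parameter. As a loosely related, minimal sketch, the code below uses a standard one-parameter EWMA variance recursion and reports its reciprocal as the precision, plus a one-day Gaussian VaR; this stands in for, but is not, the local scale model's multiplicative random walk on the precision, and the smoothing value and simulated returns are assumptions.

    import numpy as np

    def ewma_precision(returns, delta=0.97):
        """One-parameter EWMA variance recursion, returned as (volatility, precision).
        This is a stand-in illustration, not the paper's local scale model."""
        returns = np.asarray(returns, dtype=float)
        sigma2 = np.var(returns)                     # crude initialization
        out = np.empty(len(returns))
        for t, r in enumerate(returns):
            out[t] = sigma2
            sigma2 = delta * sigma2 + (1.0 - delta) * r * r
        return np.sqrt(out), 1.0 / out               # volatility and "precision" paths

    # usage with simulated daily returns standing in for the Ibovespa series
    rng = np.random.default_rng(0)
    rets = rng.normal(0.0, 0.02, size=1500)
    vol, precision = ewma_precision(rets)
    print(round(-1.645 * vol[-1], 4))                # one-day 95% Gaussian VaR sketch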
Abstract:
This research should be considered an implementation of Goetzmann and Jorion (1999). In order to provide a more realistic scenario, we implemented a GARCH(1,1) approach for the residuals of returns and a multifactor model, so as to better replicate the systematic risk of a market. The new simulations reveal new aspects of emerging markets' expected returns: the unpredictability of emerging markets' returns with respect to the global factor does not depend on the year of emergence, and unsystematic risk explains the returns of emerging markets over a much longer period of time. The results also reveal the strong impact of the exchange rate, the commodities index and the global factor on emerging markets' expected returns.
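The GARCH(1,1) recursion mentioned above can be sketched as follows; the parameter values and the simulated residual series are illustrative assumptions, and the multifactor part of the setup is not reproduced here.

    import numpy as np

    def garch_11_variance(residuals, omega=1e-6, alpha=0.08, beta=0.90):
        """Conditional variance path of a GARCH(1,1) model:
        sigma2[t] = omega + alpha * eps[t-1]**2 + beta * sigma2[t-1].
        Parameter values are illustrative, not estimates from the paper."""
        eps = np.asarray(residuals, dtype=float)
        sigma2 = np.empty_like(eps)
        sigma2[0] = omega / (1.0 - alpha - beta)     # start at the unconditional variance
        for t in range(1, len(eps)):
            sigma2[t] = omega + alpha * eps[t - 1] ** 2 + beta * sigma2[t - 1]
        return sigma2

    # usage on simulated return residuals
    rng = np.random.default_rng(1)
    residuals = rng.normal(0.0, 0.01, size=1000)
    print(garch_11_variance(residuals)[:3])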
Abstract:
Organizations are complex systems. A conceptual model of the enterprise is needed that is: coherent (the distinguished aspect models constitute a logical and truly integral whole); comprehensive (all relevant issues are covered); consistent (the aspect models are free from contradictions or irregularities); concise (no superfluous matters are contained in it); and essential (it shows only the essence of the enterprise, i.e., the model abstracts from all realization and implementation issues). The world is in great need of transparency about the operation of all the systems we work with daily, ranging from domestic appliances to the big societal institutions. In this context the field of enterprise ontology has emerged with the aim of creating models that help to understand the essence of the construction and operation of complete systems; more specifically, of enterprises. Enterprise ontology arises as a way to look through the distracting and confusing appearance of an enterprise right into its deep kernel. From the perspective of the system designer, this provides the tools needed to design a successful system in a way that reflects the desires and needs of the workers of the enterprise. This project's context is the use of DEMO (Design and Engineering Methodology for Organizations) for (re)designing or (re)engineering an enterprise, namely a process of the construction department of a city hall; the lack of a well-founded theory about the construction and operation of these processes was the motivation behind this work. The purpose of applying the DEMO theory and method was to optimize the process, automating it as much as possible, while reducing paper and the time spent between tasks and providing a better service to the citizens.
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Much has been researched and discussed about the importance of knowledge in organizations. We are witnessing the establishment of the knowledge economy, but this "new economy" carries with it a whole complex system of metrics and evaluations and cannot be dissociated from it. Given their importance, knowledge management initiatives must be continually assessed on their progress in order to verify whether they are moving towards achieving their goals. Thus, good measurement practices should include not only how the organization quantifies its knowledge capital, but also how resources are allocated to support its growth. With the aspects listed above in mind, this paper presents an approach to a model for knowledge extraction using an ERP system, suggesting the establishment of a set of indicators for assessing organizational performance. The objective is to evaluate the implementation of knowledge management projects and thus observe the overall development of the organization.
Abstract:
Research in Requirements Engineering has been growing in recent years. Researchers are concerned with a set of open issues such as: communication between the several user profiles involved in software engineering; scope definition; and volatility and traceability issues. To cope with these issues, a set of works concentrates on (i) defining processes to collect clients' specifications in order to solve scope issues; (ii) defining models to represent requirements in order to address communication and traceability issues; and (iii) working on mechanisms and processes to be applied to requirements modeling in order to facilitate requirements evolution and maintenance, addressing volatility and traceability issues. We propose an iterative Model-Driven process to solve these issues, based on a double-layered CIM to communicate requirements-related knowledge to a wider range of stakeholders. We also present a tool to support the requirements engineer throughout the RE process. Finally, we present a case study to illustrate the benefits and usage of the process and tool.
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
We consider the management branch model in which the random resources of the subsystem follow exponential distributions. The deterministic equivalent is a block-structured quadratic programming problem. It is solved effectively by means of a decomposition method based on iterative aggregation. The aggregation problem at the upper level is solved analytically. This overcomes the difficulties associated with the large dimension of the main problem.
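To illustrate how a block-structured QP with a single coupling resource constraint can be attacked by decomposition, the sketch below applies a simple dual (price) decomposition in which each block subproblem is solved in closed form; the toy problem data are assumptions, and this is a generic decomposition scheme, not the iterative-aggregation method of the paper.

    import numpy as np

    # Toy block-separable QP (assumed for illustration, not the paper's model):
    #   minimize   sum_i 0.5*q[i]*x[i]**2 + c[i]*x[i]
    #   subject to sum_i a[i]*x[i] <= b,   x >= 0
    q = np.array([2.0, 1.0, 4.0])
    c = np.array([-3.0, -2.0, -5.0])
    a = np.array([1.0, 2.0, 1.5])
    b = 2.0

    lam = 0.0                                    # dual price of the shared resource
    for k in range(1, 501):
        # each block subproblem has a closed-form nonnegative minimizer given the price
        x = np.maximum(0.0, -(c + lam * a) / q)
        # diminishing-step subgradient ascent on the coupling constraint
        lam = max(0.0, lam + (a @ x - b) / k)

    print(np.round(x, 3), round(float(a @ x), 3), round(lam, 3))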
Abstract:
In this work we show that the implementation of spontaneous breaking of lepton number in the 3-3-1 model with right-handed neutrinos gives rise to fast neutrino decay with Majoron emission and generates a number of new contributions to neutrinoless double beta decay.
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
The code STATFLUX, implementing a new and simple statistical procedure for the calculation of transfer coefficients in radionuclide transport to animals and plants, is proposed. The method is based on the general multiple-compartment model, which uses a system of linear equations involving geometrical volume considerations. Flow parameters were estimated by employing two different least-squares procedures: the Derivative and Gauss-Marquardt methods, with the available experimental data of radionuclide concentrations as the input functions of time. The solution of the inverse problem, which relates a given set of flow parameters to the time evolution of the concentration functions, is achieved via a Monte Carlo simulation procedure.
Program summary
Title of program: STATFLUX
Catalogue identifier: ADYS_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADYS_v1_0
Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland
Licensing provisions: none
Computer for which the program is designed and others on which it has been tested: Micro-computer with Intel Pentium III, 3.0 GHz
Installation: Laboratory of Linear Accelerator, Department of Experimental Physics, University of São Paulo, Brazil
Operating system: Windows 2000 and Windows XP
Programming language used: Fortran-77 as implemented in Microsoft Fortran 4.0. NOTE: Microsoft Fortran includes non-standard features which are used in this program. Standard Fortran compilers such as g77, f77, ifort and NAG95 are not able to compile the code, and therefore it has not been possible for the CPC Program Library to test the program.
Memory required to execute with typical data: 8 Mbytes of RAM and 100 MB of hard disk space
No. of bits in a word: 16
No. of lines in distributed program, including test data, etc.: 6912
No. of bytes in distributed program, including test data, etc.: 229 541
Distribution format: tar.gz
Nature of the physical problem: The investigation of transport mechanisms for radioactive substances through environmental pathways is very important for the radiological protection of populations. One such pathway, associated with the food chain, is the grass-animal-man sequence. The distribution of trace elements in humans and laboratory animals has been intensively studied over the past 60 years [R.C. Pendlenton, C.W. Mays, R.D. Lloyd, A.L. Brooks, Differential accumulation of iodine-131 from local fallout in people and milk, Health Phys. 9 (1963) 1253-1262]. In addition, investigations on the incidence of cancer in humans, and a possible causal relationship to radioactive fallout, have been undertaken [E.S. Weiss, M.L. Rallison, W.T. London, W.T. Carlyle Thompson, Thyroid nodularity in southwestern Utah school children exposed to fallout radiation, Amer. J. Public Health 61 (1971) 241-249; M.L. Rallison, B.M. Dobyns, F.R. Keating, J.E. Rall, F.H. Tyler, Thyroid diseases in children, Amer. J. Med. 56 (1974) 457-463; J.L. Lyon, M.R. Klauber, J.W. Gardner, K.S. Udall, Childhood leukemia associated with fallout from nuclear testing, N. Engl. J. Med. 300 (1979) 397-402]. Among the pathways by which radionuclides enter the human (or animal) body, ingestion is the most important because it is closely related to life-long alimentary (or dietary) habits. Those radionuclides which are able to enter the living cells by either metabolic or other processes give rise to localized doses which can be very high. The evaluation of these internally localized doses is of paramount importance for the assessment of radiobiological risks and radiological protection. The time behavior of trace concentration in organs is the principal input for the prediction of internal doses after acute or chronic exposure. The General Multiple-Compartment Model (GMCM) is a powerful and widely accepted method for biokinetic studies, which allows the calculation of the concentration of trace elements in organs as a function of time when the flow parameters of the model are known. However, few biokinetics data exist in the literature, and the determination of flow and transfer parameters by statistical fitting for each system is an open problem.
Restrictions on the complexity of the problem: This version of the code works with the constant-volume approximation, which is valid for many situations where the biological half-life of a trace element is shorter than the volume rise time. Another restriction is related to the central-flux model: the model considered in the code assumes that there exists one central compartment (e.g., blood) that connects the flow with all other compartments, and the flow between the other compartments is not included.
Typical running time: Depends on the choice of calculations. Using the Derivative Method, the time is very short (a few minutes) for any number of compartments considered. When the Gauss-Marquardt iterative method is used, the calculation time can be approximately 5-6 hours when around 15 compartments are considered. (C) 2006 Elsevier B.V. All rights reserved.
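To make the compartment structure above concrete, the sketch below simulates a small central-flux model (blood plus two organ compartments) as a linear ODE system and recovers the transfer rates by least squares; the rate values, compartment layout, and synthetic data are assumptions made for illustration and are unrelated to the STATFLUX code itself, which is written in Fortran-77.

    import numpy as np
    from scipy.integrate import odeint
    from scipy.optimize import least_squares

    def central_flux_rhs(c, t, k_out, k_back):
        """Constant-volume central-flux model: compartment 0 (blood) exchanges with
        each peripheral compartment; peripheral compartments do not exchange with
        each other. Rates here are illustrative assumptions."""
        blood, organs = c[0], c[1:]
        d_organs = k_out * blood - k_back * organs
        d_blood = -np.sum(d_organs)
        return np.concatenate(([d_blood], d_organs))

    t = np.linspace(0.0, 10.0, 50)
    true_k_out, true_k_back = np.array([0.4, 0.2]), np.array([0.1, 0.3])
    c0 = np.array([1.0, 0.0, 0.0])                    # all activity in blood at t = 0
    data = odeint(central_flux_rhs, c0, t, args=(true_k_out, true_k_back))

    def residuals(params):
        k_out, k_back = params[:2], params[2:]
        model = odeint(central_flux_rhs, c0, t, args=(k_out, k_back))
        return (model - data).ravel()

    fit = least_squares(residuals, x0=np.array([0.1, 0.1, 0.1, 0.1]), bounds=(0.0, 2.0))
    print(np.round(fit.x, 3))                         # recovered transfer coefficients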
Abstract:
An economic model including the labor resource and the process stage configuration is proposed to design g charts, allowing all the design parameters to be varied in an adaptive way. A random shift size is considered during the economic design selection. The results obtained for a benchmark of 64 process stage scenarios show that the activity configuration and some process operating parameters influence the selection of the best control chart strategy: to model the random shift size, its exact distribution can be approximately fitted by a discrete distribution obtained from a relatively small sample of historical data. However, an accurate estimation of the inspection costs associated with the SPC activities is far from being achieved. An illustrative example shows the implementation of the proposed economic model in a real industrial case. (C) 2011 Elsevier B.V. All rights reserved.
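The point above about fitting the random shift size with a discrete distribution drawn from a small historical sample can be illustrated as follows; the bin layout, sample size, and underlying "true" distribution are assumptions made only for the example.

    import numpy as np

    rng = np.random.default_rng(2)
    # hypothetical historical record of observed shift magnitudes (in sigma units)
    historical_shifts = rng.gamma(shape=2.0, scale=0.75, size=40)

    # approximate the unknown shift-size distribution on a small discrete support:
    # bin the sample and use the bin midpoints with their empirical frequencies
    edges = np.linspace(0.0, historical_shifts.max(), 6)
    counts, _ = np.histogram(historical_shifts, bins=edges)
    midpoints = 0.5 * (edges[:-1] + edges[1:])
    probs = counts / counts.sum()

    for m, p in zip(midpoints, probs):
        print(f"shift ~ {m:.2f} sigma with probability {p:.2f}")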
Abstract:
This paper discusses the application of a damage detection methodology to monitor the location and extent of partial structural damage. The methodology iteratively combines a model updating technique based on frequency response functions (FRFs) with monitoring data, aiming at identifying the damaged area of the structure. After the updating procedure reaches a good correlation between the models, the parameters of the damaged structure are compared with those of the undamaged one to locate the deteriorated area. The influence of the FEM mesh size on the evaluation of the extent of the damage is also discussed. The methodology is applied using real experimental data from a spatial frame structure.
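A minimal sketch in the spirit of the FRF-based updating described above: element stiffness scaling factors of a small spring-mass chain are adjusted by least squares so that the model FRF matches a "measured" FRF, and factors that drop well below the undamaged value of 1.0 point to the damaged element. The structure, damping level, and damage value are assumptions made for illustration.

    import numpy as np
    from scipy.optimize import least_squares

    n = 4                                    # DOFs of a toy fixed-free spring-mass chain
    m = np.eye(n)                            # unit masses
    k0 = 1000.0                              # nominal element stiffness
    freqs = np.linspace(1.0, 8.0, 40)        # excitation frequencies (rad/s), sub-resonant

    def stiffness(scales):
        """Assemble the chain stiffness matrix from per-element scaling factors."""
        K = np.zeros((n, n))
        for e, s in enumerate(scales):
            ke = s * k0
            if e == 0:                       # element 0 connects the ground to DOF 0
                K[0, 0] += ke
            else:                            # element e connects DOF e-1 and DOF e
                K[np.ix_([e - 1, e], [e - 1, e])] += ke * np.array([[1.0, -1.0], [-1.0, 1.0]])
        return K

    def frf(scales):
        """Receptance magnitudes for a force at the free end (light viscous damping)."""
        K, force = stiffness(scales), np.eye(n)[:, -1]
        rows = [np.abs(np.linalg.solve(K - w ** 2 * m + 1j * 0.01 * w * np.eye(n), force))
                for w in freqs]
        return np.array(rows)

    measured = frf(np.array([1.0, 1.0, 0.7, 1.0]))    # "damaged": element 2 lost 30% stiffness
    fit = least_squares(lambda s: (frf(s) - measured).ravel(),
                        x0=np.ones(n), bounds=(0.1, 1.5))
    print(np.round(fit.x, 2))                # a factor well below 1.0 flags the damaged element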
Abstract:
An approach using straight lines as features to solve the photogrammetric space resection problem is presented. An explicit mathematical model relating straight lines in both object and image space is used. Based on this model, Kalman filtering is applied to solve the space resection problem. The recursive property of the filter is used in an iterative process in which the sequentially estimated camera location parameters are fed back into the feature extraction process in the image. This feedback leads to a gradual reduction of the image space to be searched for features, and consequently eliminates the bottleneck caused by the high computational cost of the image segmentation phase. It also enables feature extraction and the determination of feature correspondence between image and object space in an automatic way, i.e., without operator interference. Results obtained from simulated and real data show that highly accurate space resection parameters are obtained, as well as a progressive reduction in processing time. The accuracy obtained, the automatic correspondence process, and the short processing time show that the proposed approach can be used in many real-time machine vision systems, making possible the implementation of applications not feasible until now.
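The sequential refinement described above can be sketched with a standard Kalman measurement update in which the camera parameters form the state; the linear observation rows below stand in for the paper's linearized straight-line measurement model and, like the two-parameter state and noise levels, are purely illustrative assumptions.

    import numpy as np

    def kalman_update(x, P, z, H, R):
        """One Kalman measurement update for a (linearized) observation z = H x + v."""
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (z - H @ x)
        P = (np.eye(len(x)) - K @ H) @ P
        return x, P

    # usage: sequentially refine two constant "pose" parameters from noisy scalar
    # observations (each row H stands in for one linearized line observation)
    rng = np.random.default_rng(3)
    true_x = np.array([2.0, 3.0])
    x, P = np.zeros(2), 10.0 * np.eye(2)          # vague prior
    for _ in range(30):
        H = rng.normal(size=(1, 2))               # one linearized observation row
        z = H @ true_x + rng.normal(0.0, 0.05, 1) # noisy measurement
        x, P = kalman_update(x, P, z, H, R=0.05 ** 2 * np.eye(1))
    print(np.round(x, 2), np.round(np.sqrt(np.diag(P)), 3))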
Abstract:
In this article, an implementation of structural health monitoring process automation based on vibration measurements is proposed. The work presents an alternative approach whose intent is to exploit the capability of model updating techniques, associated with neural networks, for use in an automated fault detection process. The updating procedure supplies a reliable model which permits simulating any damage condition in order to establish a direct correlation between faults and deviations in the response of the model. The ability of the neural networks to recognize, from a known signature, changes in the actual data of a model in real time is explored to investigate changes in the actual operating conditions of the system. The training of the network is performed using a compressed spectrum signal created for each specific type of fault. Different fault conditions for a frame structure are evaluated using simulated data as well as measured experimental data.
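A compact illustration of the signature-based idea above: a small neural network is trained to map compressed spectrum signatures to fault labels. The feature construction, network size, mode frequencies, and simulated fault signatures are assumptions made only for the example, not the authors' setup.

    import numpy as np
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(4)

    def compressed_spectrum(signal, n_bands=16):
        """Crude spectrum compression: average the magnitude spectrum over bands."""
        mag = np.abs(np.fft.rfft(signal))
        return np.array([band.mean() for band in np.array_split(mag, n_bands)])

    def simulate_response(fault):
        """Toy vibration response; each fault type halves the amplitude of one mode."""
        t = np.linspace(0.0, 1.0, 512)
        amps = np.array([1.0, 1.0, 1.0])
        if fault > 0:
            amps[fault - 1] *= 0.5
        modes = [20.0, 60.0, 110.0]                   # Hz, illustrative
        sig = sum(a * np.sin(2 * np.pi * f * t) for a, f in zip(amps, modes))
        return sig + 0.1 * rng.normal(size=t.size)

    # labelled set of compressed-spectrum signatures: 0 = healthy, 1-3 = fault types
    X = np.array([compressed_spectrum(simulate_response(f)) for f in range(4) for _ in range(60)])
    y = np.array([f for f in range(4) for _ in range(60)])

    clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
    clf.fit(X, y)
    print(clf.score(X, y))                            # training accuracy on the toy signatures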