976 results for Unified Model Reference
Abstract:
This research has responded to the need for diagnostic reference tools explicitly linking the influence of environmental uncertainty and performance within the supply chain. Uncertainty is a key factor influencing performance and an important measure of the operating environment. We develop and demonstrate a novel reference methodology based on data envelopment analysis (DEA) for examining the performance of value streams within the supply chain with specific reference to the level of environmental uncertainty they face. In this paper, using real industrial data, 20 product supply value streams within the European automotive industry sector are evaluated. Two are found to be efficient. The peer reference groups for the underperforming value streams are identified and numerical improvement targets are derived. The paper demonstrates how DEA can be used to guide supply chain improvement efforts through role-model identification and target setting, in a way that recognises the multiple dimensions/outcomes of the supply chain process and the influence of its environmental conditions. We have facilitated the contextualisation of environmental uncertainty and its incorporation into a specific diagnostic reference tool.
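The abstract does not reproduce the authors' DEA formulation or the input/output set used for the 20 value streams, so the following is only a minimal sketch of a standard input-oriented CCR envelopment model of the kind such analyses rest on; the value-stream data, and the idea of treating an uncertainty measure as one of the inputs, are illustrative assumptions rather than the paper's.

```python
# Minimal sketch of an input-oriented CCR DEA model (illustrative data only).
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(X, Y, k):
    """Efficiency of unit k, given inputs X (n x m) and outputs Y (n x s)."""
    n, m = X.shape
    _, s = Y.shape
    # Decision variables: theta, lambda_1..lambda_n
    c = np.zeros(n + 1)
    c[0] = 1.0                                   # minimise theta
    # Inputs: sum_j lambda_j x_ji - theta * x_ki <= 0
    A_in = np.hstack([-X[[k]].T, X.T])
    b_in = np.zeros(m)
    # Outputs: -sum_j lambda_j y_jr <= -y_kr  (i.e. at least unit k's output)
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])
    b_out = -Y[k]
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([b_in, b_out]),
                  bounds=[(0, None)] * (n + 1))
    return res.x[0]                              # theta = 1 means efficient

# Illustrative data: 4 value streams, 2 inputs (one an uncertainty proxy), 1 output
X = np.array([[5.0, 3.0], [8.0, 2.0], [6.0, 4.0], [7.0, 5.0]])
Y = np.array([[10.0], [12.0], [9.0], [8.0]])
for k in range(len(X)):
    print(k, round(dea_efficiency(X, Y, k), 3))
```

Units with theta = 1 lie on the efficient frontier; for theta < 1 the optimal lambda weights identify the peer reference group and the scaled inputs yield numerical improvement targets, which is the role-model and target-setting use the abstract describes.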
Abstract:
This paper reviews nine software packages with particular reference to their GARCH model estimation accuracy when judged against a respected benchmark. We consider the numerical consistency of GARCH and EGARCH estimation and forecasting. Our results have a number of implications for published research and future software development. Finally, we argue that the establishment of benchmarks for other standard non-linear models is long overdue.
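As a concrete illustration of the kind of computation being benchmarked, here is a minimal sketch of GARCH(1,1) quasi-maximum-likelihood estimation; the initialisation, optimiser and simulated data are assumptions for illustration, not the benchmark procedure used in the review.

```python
# Minimal sketch of GARCH(1,1) quasi-maximum-likelihood estimation.
import numpy as np
from scipy.optimize import minimize

def garch11_neg_loglik(params, r):
    omega, alpha, beta = params
    T = len(r)
    sigma2 = np.empty(T)
    sigma2[0] = np.var(r)                        # a common initialisation choice
    for t in range(1, T):
        sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
    return 0.5 * np.sum(np.log(2 * np.pi) + np.log(sigma2) + r ** 2 / sigma2)

rng = np.random.default_rng(0)
r = rng.standard_normal(2000) * 0.01             # placeholder return series

res = minimize(garch11_neg_loglik, x0=[1e-6, 0.05, 0.90], args=(r,),
               bounds=[(1e-12, None), (0.0, 1.0), (0.0, 1.0)],
               method="L-BFGS-B")
print(res.x)  # (omega, alpha, beta): the estimates whose accuracy packages differ on
```

Differences across packages in exactly this kind of recursion (variance initialisation, optimiser, convergence tolerances) are what a benchmark comparison exposes.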
Abstract:
This paper seeks to examine the particular operations of gender and cultural politics that both shaped and restrained possible 'networked' interactions between Jamaican women and their British 'motherlands' during the first forty years of the twentieth century. Paying particular attention to the poetry of Albinia Catherine MacKay (a Scots Creole) and the political journalism of Una Marson (a black Jamaican), I shall seek to examine why both writers speak in and of voices out of place. MacKay's poems work against the critical pull of transnational modernism to reveal aesthetic and cultural isolation through a model of strained belonging in relation to both her Jamaican home and an ancestral Scotland. A small number of poems from her 1912 collection, dedicated to the historical struggle between the English and Scots for the rule of Scotland and for cultural self-determination, some of them written in a Scottish idiom, may help us to read the complex cultural negotiations that silently inform the seeming incommensurability of location and locution revealed in these works. In contrast, Marson's journalism, although even less known than her creative writings, is both politically and intellectually radical in its arguments concerning the mutual articulation of race and gender empowerment. However, Marson remains aware of her inability to articulate these convictions with force in a British context, and thereby of the way in which speaking out of place also silences her.
Abstract:
Our digital universe is rapidly expanding: more and more daily activities are digitally recorded; data arrive in streams, need to be analyzed in real time and may evolve over time. In the last decade many adaptive learning algorithms and prediction systems, which can automatically update themselves with new incoming data, have been developed. The majority of those algorithms focus on improving predictive performance and assume that a model update is always desired, as soon as possible and as frequently as possible. In this study we consider a potential model update as an investment decision which, as in the financial markets, should be taken only if a certain return on investment is expected. We introduce and motivate a new research problem for data streams: cost-sensitive adaptation. We propose a reference framework for analyzing adaptation strategies in terms of costs and benefits. Our framework allows us to characterize and decompose the costs of model updates, and to assess and interpret the gains in performance due to model adaptation for a given learning algorithm on a given prediction task. Our proof-of-concept experiment demonstrates how the framework can aid in analyzing and managing adaptation decisions in the chemical industry.
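A minimal sketch of the investment view of adaptation described above: retrain only when the estimated benefit of the update exceeds its estimated cost. The class, cost figures and gain estimates are illustrative placeholders rather than the paper's framework.

```python
# Toy cost-benefit rule for deciding whether to adapt a streaming model.
from dataclasses import dataclass

@dataclass
class AdaptationDecision:
    update_cost: float        # e.g. labelling, computation, redeployment effort
    benefit_per_error: float  # value of each prediction error avoided

    def should_update(self, errors_current: float, errors_after_update: float) -> bool:
        # Update only if the expected return on investment is positive.
        expected_gain = (errors_current - errors_after_update) * self.benefit_per_error
        return expected_gain > self.update_cost

decision = AdaptationDecision(update_cost=50.0, benefit_per_error=2.0)
# Estimated errors over the next window without and with retraining:
print(decision.should_update(errors_current=120, errors_after_update=80))   # True
print(decision.should_update(errors_current=120, errors_after_update=110))  # False
```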
Abstract:
Current state-of-the-art global climate models produce different values for Earth’s mean temperature. When comparing simulations with each other and with observations it is standard practice to compare temperature anomalies with respect to a reference period. It is not always appreciated that the choice of reference period can affect conclusions, both about the skill of simulations of past climate, and about the magnitude of expected future changes in climate. For example, observed global temperatures over the past decade are towards the lower end of the range of CMIP5 simulations irrespective of what reference period is used, but exactly where they lie in the model distribution varies with the choice of reference period. Additionally, we demonstrate that projections of when particular temperature levels are reached, for example 2K above ‘pre-industrial’, change by up to a decade depending on the choice of reference period. In this article we discuss some of the key issues that arise when using anomalies relative to a reference period to generate climate projections. We highlight that there is no perfect choice of reference period. When evaluating models against observations, a long reference period should generally be used, but how long depends on the quality of the observations available. The IPCC AR5 choice to use a 1986-2005 reference period for future global temperature projections was reasonable, but a case-by-case approach is needed for different purposes and when assessing projections of different climate variables. Finally, we recommend that any studies that involve the use of a reference period should explicitly examine the robustness of the conclusions to alternative choices.
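A small sketch of the anomaly calculation at issue: the same temperature series produces anomalies that differ by a constant offset depending on the reference period, which is why where observations sit within a model distribution can shift with that choice. The data here are synthetic placeholders.

```python
# Anomalies of one (synthetic) global-mean temperature series under two baselines.
import numpy as np

years = np.arange(1850, 2021)
rng = np.random.default_rng(1)
temp = 13.8 + 0.008 * (years - 1850) + rng.normal(0, 0.1, years.size)  # toy GMST

def anomalies(series, years, ref_start, ref_end):
    ref = series[(years >= ref_start) & (years <= ref_end)].mean()
    return series - ref

a_1961_1990 = anomalies(temp, years, 1961, 1990)
a_1986_2005 = anomalies(temp, years, 1986, 2005)   # the AR5 choice mentioned above
print(round(a_1961_1990[-1] - a_1986_2005[-1], 3)) # constant offset between baselines
```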
Abstract:
A new formal approach for the representation of polarization states of coherent and partially coherent electromagnetic plane waves is presented. Its basis is a purely geometric construction that represents the normalised complex-analytic coherent wave as a generating line in the sphere of wave directions, whose Stokes vector is determined by the intersection with the conjugate generating line. The Poincare sphere is now located in physical space, simply a coordination of the wave sphere, its axis aligned with the wave vector. Algebraically, the generators representing coherent states are represented by spinors, and this is made consistent with the spinor-tensor representation of electromagnetic theory by means of an explicit reference spinor we call the phase flag. As a faithful unified geometric representation, the new model provides improved formal tools for resolving many of the geometric difficulties and ambiguities that arise in the traditional formalism.
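For orientation, the standard spinor (Jones-vector) to Stokes-vector map that such a geometric construction reworks can be written as follows; the notation and sign conventions are generic, not necessarily the paper's.

```latex
% Generic spinor (Jones-vector) -> Stokes-vector map; sign conventions vary.
S_\mu = \psi^{\dagger}\sigma_\mu\,\psi, \qquad
\psi = \begin{pmatrix}\psi_1\\ \psi_2\end{pmatrix}, \qquad
\sigma_0 = \begin{pmatrix}1&0\\0&1\end{pmatrix},\;
\sigma_1 = \begin{pmatrix}1&0\\0&-1\end{pmatrix},\;
\sigma_2 = \begin{pmatrix}0&1\\1&0\end{pmatrix},\;
\sigma_3 = \begin{pmatrix}0&-i\\ i&0\end{pmatrix}.
```

The four S_mu give the total intensity and the three Poincare-sphere coordinates; the abstract's point is that this sphere is realised geometrically in physical space, aligned with the wave vector, rather than as an abstract parameter space.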
Abstract:
The impact of extreme sea ice initial conditions on modelled climate is analysed for a fully coupled atmosphere-ocean-sea ice general circulation model, the Hadley Centre climate model HadCM3. A control run with greenhouse gas concentrations fixed at preindustrial levels is chosen as the reference experiment. Sensitivity experiments show an almost complete recovery from a total removal or a strong increase of sea ice after four years. Thus, uncertainties in initial sea ice conditions seem to be unimportant for climate modelling on decadal or longer time scales. When the initial conditions of the ocean mixed layer were adjusted to ice-free conditions, a few substantial differences remained for more than 15 model years, but these differences are clearly smaller than the spread among the HadCM3 run and the other 19 IPCC Fourth Assessment Report climate model preindustrial runs. An important remaining task is to improve climate models in simulating past sea ice variability, to enable them to make reliable projections for the 21st century.
Abstract:
In this paper, we formulate a flexible density function from the selection mechanism viewpoint (see, for example, Bayarri and DeGroot (1992) and Arellano-Valle et al. (2006)) which possesses nice biological and physical interpretations. The new density function contains as special cases many models that have been proposed recently in the literature. In constructing this model, we assume that the number of competing causes of the event of interest has a general discrete distribution characterized by its probability generating function. This function has an important role in the selection procedure as well as in computing the conditional personal cure rate. Finally, we illustrate how various models can be deduced as special cases of the proposed model.
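For orientation, the generic competing-cause construction that such models build on can be written as follows, where N is the number of competing causes with probability generating function A_N and S(t) is the common survival function of the latent event times; the notation is standard rather than the authors'.

```latex
% Competing-cause (cure-rate) construction via the probability generating function.
S_{\mathrm{pop}}(t) \;=\; \Pr(\text{no cause has produced the event by } t)
\;=\; \mathbb{E}\!\left[S(t)^{N}\right] \;=\; A_N\!\bigl(S(t)\bigr),
\qquad
p_0 \;=\; \Pr(N=0) \;=\; A_N(0).
```

Different choices of the discrete distribution of N (and hence of A_N) recover many of the special-case models referred to above, with p_0 playing the role of the cure fraction.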
Abstract:
Security administrators face the challenge of designing, deploying and maintaining a variety of configuration files related to security systems, especially in large-scale networks. These files have heterogeneous syntaxes and follow differing semantic concepts. Nevertheless, they are interdependent, because security services have to cooperate and their configurations have to be consistent with each other so that global security policies are completely and correctly enforced. To tackle this problem, our approach supports a comfortable definition of an abstract high-level security policy and provides an automated derivation of the desired configuration files. It is an extension of policy-based management and policy hierarchies, combining model-based management (MBM) with system modularization. MBM employs an object-oriented model of the managed system to obtain the details needed for automated policy refinement. The modularization into abstract subsystems (ASs) segments the system, and the model, into units that more closely encapsulate related system components and provide focused abstract views. As a result, scalability is achieved and even comprehensive IT systems can be modelled in a unified manner. The associated tool MoBaSeC (Model-Based-Service-Configuration) supports interactive graphical modelling, automated model analysis and policy refinement with the derivation of configuration files. We describe the MBM and AS approaches, outline the tool functions and exemplify their application and the results obtained.
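The paper's object model and configuration syntax are not reproduced in the abstract; the toy sketch below only illustrates the general refinement idea, namely expanding a high-level policy plus a model of the managed system into per-device configuration entries. All class names, the rule format and the output syntax are invented for illustration and are not MoBaSeC's.

```python
# Toy sketch of policy refinement over an object model of the managed system.
from dataclasses import dataclass

@dataclass
class Firewall:
    name: str
    def render(self, src: str, dst: str, service: str) -> str:
        # Produce one concrete configuration line for this enforcement point.
        return f"{self.name}: allow {service} from {src} to {dst}"

@dataclass
class AbstractSubsystem:
    name: str
    firewalls: list

    def refine(self, policy: dict) -> list:
        # Derive one concrete rule per enforcement point in this subsystem.
        return [fw.render(policy["src"], policy["dst"], policy["service"])
                for fw in self.firewalls]

dmz = AbstractSubsystem("DMZ", [Firewall("fw-edge"), Firewall("fw-core")])
policy = {"src": "office-net", "dst": "web-server", "service": "https"}
for line in dmz.refine(policy):
    print(line)
```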
Abstract:
The Rio Apa cratonic fragment crops out in Mato Grosso do Sul State of Brazil and in northeastern Paraguay. It comprises Paleo-Mesoproterozoic medium-grade metamorphic rocks, intruded by granitic rocks, and is covered by the Neoproterozoic deposits of the Corumbá and Itapucumí Groups. To the east it is bounded by the southern portion of the Paraguay belt. In this work, more than 100 isotopic determinations, including U-Pb SHRIMP zircon ages, Rb-Sr and Sm-Nd whole-rock determinations, as well as K-Ar and Ar-Ar mineral ages, were reassessed in order to obtain a complete picture of its regional geological history. The tectonic evolution of the Rio Apa Craton starts with the formation of a series of magmatic arc complexes. The oldest U-Pb SHRIMP zircon age comes from a banded gneiss collected in the northern part of the region, with an age of 1950 +/- 23 Ma. The large granitic intrusion of the Alumiador Batholith yielded a U-Pb zircon age of 1839 +/- 33 Ma, and from the southeastern part of the area two orthogneisses gave zircon U-Pb ages of 1774 +/- 26 Ma and 1721 +/- 25 Ma. These may be coeval with the Alto Terere metamorphic rocks of the northeastern corner, intruded in their turn by the Baia das Garcas granitic rocks, one of which yielded a zircon U-Pb age of 1754 +/- 49 Ma. The original magmatic protoliths of these rocks involved some crustal component, as indicated by Sm-Nd TDM model ages between 1.9 and 2.5 Ga. Regional Sr isotopic homogenization, associated with tectonic deformation and medium-grade metamorphism, occurred at approximately 1670 Ma, as suggested by Rb-Sr whole-rock reference isochrons. Finally, the Ar data indicate that at about 1300 Ma the Rio Apa Craton was affected by widespread regional heating, when the temperature probably exceeded 350 degrees C. The geographic distribution, ages and isotopic signatures of the lithotectonic units suggest the existence of a major suture separating two different tectonic domains, juxtaposed at about 1670 Ma. From that time on, the unified Rio Apa continental block behaved as one coherent and stable tectonic unit. It correlates well with the SW corner of the Amazonian Craton, where the medium-grade rocks of the Juruena-Rio Negro tectonic province, with ages between 1600 and 1780 Ma, were reworked at about 1300 Ma. At the largest scale, the Rio Apa Craton is probably attached to the larger Amazonian Craton, and the present configuration of southwestern South America is possibly due to a complex arrangement of allochthonous blocks of different sizes, such as Arequipa, Antofalla and Pampia, that may have originated as disrupted parts of either Laurentia or Amazonia and were trapped during later collisions of these continental masses.
Abstract:
Prediction of random effects is an important problem with expanding applications. In the simplest context, the problem corresponds to prediction of the latent value (the mean) of a realized cluster selected via two-stage sampling. Recently, Stanek and Singer [Predicting random effects from finite population clustered samples with response error. J. Amer. Statist. Assoc. 99, 119-130] developed best linear unbiased predictors (BLUP) under a finite population mixed model that outperform BLUPs from mixed models and superpopulation models. Their setup, however, does not allow for unequally sized clusters. To overcome this drawback, we consider an expanded finite population mixed model based on a larger set of random variables that span a higher dimensional space than those typically applied to such problems. We show that BLUPs for linear combinations of the realized cluster means derived under such a model have considerably smaller mean squared error (MSE) than those obtained from mixed models, superpopulation models, and finite population mixed models. We motivate our general approach by an example developed for two-stage cluster sampling and show that it faithfully captures the stochastic aspects of sampling in the problem. We also consider simulation studies to illustrate the increased accuracy of the BLUP obtained under the expanded finite population mixed model.
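For reference, the textbook mixed-model BLUP against which such predictors are typically compared takes the following generalized-least-squares form; the expanded finite population predictors developed in the paper are not reproduced here.

```latex
% Standard linear mixed model and Henderson-type BLUP used as the baseline comparison.
y = X\beta + Zu + e,\qquad
\operatorname{Var}(u)=G,\quad \operatorname{Var}(e)=R,\quad V = ZGZ' + R,
\qquad
\hat\beta = (X'V^{-1}X)^{-1}X'V^{-1}y,\qquad
\hat u_{\mathrm{BLUP}} = G Z' V^{-1}\bigl(y - X\hat\beta\bigr).
```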
Abstract:
Although CMMI (Capability Maturity Model Integration) provides more detailed coverage of the product life cycle than the isolated use of other improvement processes, it still cannot be seen as a ready-to-use methodology for organizations. Each organization must map the process areas of the desired CMMI level (if the staged representation is chosen, as discussed below) to its own methodology and to its product and system development methods and techniques, also taking the organization's business objectives into account. CMMI, like other quality standards and models, says "what" to do rather than "how" to do it. Determining this "how" is a considerable additional effort that organizations must undertake when adopting these standards. To do so, organizations usually turn to consulting firms with experience in the area; such consultancies are highly recommended because they significantly increase the quality and speed of the results. This work aims to help organizations interested in implementing a quality model by providing descriptions of some of the most widely used quality models, as well as process templates, guides and forms that can be used directly or as a basis for implementing the desired model. Although it applies to the implementation of any quality model, this work is aimed more specifically at organizations seeking to implement CMMI level 2 (hereafter also abbreviated CMMI-N2). To that end, it describes this model in greater detail and provides a path for its implementation, including the description of a minimal software development process based on the RUP (Rational Unified Process) and the use of a process improvement life cycle model, IDEAL. The IDEAL model is proposed for implementing the quality model because it was originally conceived as a life cycle model for software process improvement based on the SW-CMM (Capability Maturity Model for Software). In combination with this model, it is suggested that project management techniques and processes be used for each CMMI-N2 process area, so that each process area is implemented as a project. For the implementation, guides, implementation templates (forms) and a table mapping all CMMI-N2 process areas, their goals, practices, work products and associated tools are proposed.
Abstract:
This paper discusses distribution and the historical phases of capitalism. It assumes that technical progress and growth are taking place and, given that, asks how income is functionally distributed between labor and capital, having as references the classical theory of distribution and Marx's falling tendency of the rate of profit. Based on the historical experience, it first inverts the model, making the rate of profit the constant variable in the long run and the wage rate the residuum; second, it distinguishes three types of technical progress (capital-saving, neutral and capital-using) and applies them to the history of capitalism, with the UK and France as references. Given these three types of technical progress, it distinguishes four phases of capitalist growth, of which only the second is consistent with Marx's prediction. The last phase, after World War II, should in principle be capital-saving, consistent with growth of wages above productivity. Instead, since the 1970s wages have been kept stagnant in rich countries because of, first, the fact that the Information and Communication Technology Revolution proved to be highly capital-using, opening room for a new wave of substitution of capital for labor; second, the new competition coming from developing countries; third, the emergence of the technobureaucratic or professional class; and, fourth, the new power of the neoliberal class coalition associating rentier capitalists and financiers.
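One simple way to state the inversion described above, under the standard income identity (notation generic: Y output, L labour, K capital, w the wage rate, r the rate of profit):

```latex
Y = wL + rK
\quad\Longrightarrow\quad
w = \frac{Y - rK}{L} = \frac{Y}{L}\left(1 - r\,\frac{K}{Y}\right).
```

With r held constant, wages track productivity Y/L when the capital-output ratio K/Y is constant (neutral technical progress), rise faster than productivity when K/Y falls (capital-saving), and lag behind it when K/Y rises (capital-using), which is the mechanism the four phases draw on.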
Abstract:
We study an intertemporal asset pricing model in which a representative consumer maximizes expected utility derived from both the ratio of his consumption to some reference level and this level itself. If the reference consumption level is assumed to be determined by past consumption levels, the model generalizes the usual habit formation specifications. When the reference level growth rate is made dependent on the market portfolio return and on past consumption growth, the model mixes a consumption CAPM with habit formation together with the CAPM. It therefore provides, in an expected utility framework, a generalization of the non-expected recursive utility model of Epstein and Zin (1989). When we estimate this specification with aggregate per capita consumption, we obtain economically plausible values of the preference parameters, in contrast with the habit formation or the Epstein-Zin cases taken separately. All tests performed with various preference specifications confirm that the reference level enters significantly in the pricing kernel.
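The abstract does not state the preference specification explicitly; a generic form of the objective it describes, with C_t consumption and X_t the reference level, would be:

```latex
\max\; \mathbb{E}_0 \sum_{t=0}^{\infty} \beta^{t}\,
u\!\left(\frac{C_t}{X_t},\, X_t\right),
```

with X_t driven by past consumption in the habit-formation case and, in the mixed specification, also by the market portfolio return and past consumption growth.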
Abstract:
OSAN, R.; TORT, A. B. L.; AMARAL, O. B. A mismatch-based model for memory reconsolidation and extinction in attractor networks. PLoS ONE, v. 6, p. e23113, 2011.