894 results for Unified Model Reference
Abstract:
Remote sensing observations often have correlated errors, but the correlations are typically ignored in data assimilation for numerical weather prediction. The assumption of zero correlations is often used with data thinning methods, resulting in a loss of information. As operational centres move towards higher-resolution forecasting, there is a requirement to retain data providing detail on appropriate scales. Thus an alternative approach to dealing with observation error correlations is needed. In this article, we consider several approaches to approximating observation error correlation matrices: diagonal approximations, eigendecomposition approximations and Markov matrices. These approximations are applied in incremental variational assimilation experiments with a 1-D shallow water model using synthetic observations. Our experiments quantify analysis accuracy in comparison with a reference or ‘truth’ trajectory, as well as with analyses using the ‘true’ observation error covariance matrix. We show that it is often better to include an approximate correlation structure in the observation error covariance matrix than to incorrectly assume error independence. Furthermore, by choosing a suitable matrix approximation, it is feasible and computationally cheap to include error correlation structure in a variational data assimilation algorithm.
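As a rough illustration of the approximations named above, the Python sketch below builds a Markov-type (exponentially decaying) correlation matrix for a 1-D observation network and compares it with a diagonal approximation and a truncated eigendecomposition. The grid size, correlation length-scale, and truncation level are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

# Hypothetical 1-D observation network: n equally spaced observations,
# correlation length-scale L (in grid units). Values are illustrative only.
n, L = 50, 3.0
dist = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))

# 'True' Markov (first-order auto-regressive) correlation matrix.
R_true = np.exp(-dist / L)

# Approximation 1: diagonal matrix (correlations ignored entirely).
R_diag = np.eye(n)

# Approximation 2: truncated eigendecomposition keeping the k leading modes;
# the discarded eigenvalues are replaced by their mean so R stays full rank.
k = 10
vals, vecs = np.linalg.eigh(R_true)
vals = vals[::-1]                       # sort eigenvalues descending
vecs = vecs[:, ::-1]
tail = vals[k:].mean()
vals_approx = np.concatenate([vals[:k], np.full(n - k, tail)])
R_eig = (vecs * vals_approx) @ vecs.T   # V * diag(vals_approx) * V^T

for name, R in [("diagonal", R_diag), ("eigendecomposition", R_eig)]:
    err = np.linalg.norm(R - R_true) / np.linalg.norm(R_true)
    print(f"{name:>18s}: relative Frobenius error = {err:.3f}")
```

Replacing the discarded eigenvalues by their mean keeps the approximation full rank, which matters when the matrix has to be inverted inside a variational cost function.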
Abstract:
Sea ice contains flaws including frictional contacts. We aim to describe quantitatively the mechanics of those contacts, providing local physics for geophysical models. With a focus on the internal friction of ice, we review standard micro-mechanical models of friction. The solid's deformation under normal load may be ductile or elastic. The shear failure of the contact may be by ductile flow, brittle fracture, or melting and hydrodynamic lubrication. Combinations of these give a total of six rheological models. When the material under study is ice, several of the rheological parameters in the standard models are not constant, but depend on the temperature of the bulk, on the normal stress under which samples are pressed together, or on the sliding velocity and acceleration. This has the effect of making the shear stress required for sliding dependent on sliding velocity, acceleration, and temperature. In some cases, it also perturbs the exponent in the normal-stress dependence of that shear stress away from the value that applies to most materials. We unify the models by a principle of maximum displacement for normal deformation, and of minimum stress for shear failure, reducing the controversy over the mechanism of internal friction in ice to the choice of values of four parameters in a single model. The four parameters represent, for a typical asperity contact, the sliding distance required to expel melt-water, the sliding distance required to break contact, the normal strain in the asperity, and the thickness of any ductile shear zone.
Abstract:
This paper presents single-column model (SCM) simulations of a tropical squall-line case observed during the Coupled Ocean-Atmosphere Response Experiment of the Tropical Ocean/Global Atmosphere Programme. This case-study was part of an international model intercomparison project organized by Working Group 4 ‘Precipitating Convective Cloud Systems’ of the GEWEX (Global Energy and Water-cycle Experiment) Cloud System Study. Eight SCM groups using different deep-convection parametrizations participated in this project. The SCMs were forced by temperature and moisture tendencies that had been computed from a reference cloud-resolving model (CRM) simulation using open boundary conditions. The comparison of the SCM results with the reference CRM simulation provided insight into the ability of current convection and cloud schemes to represent organized convection. The CRM results enabled a detailed evaluation of the SCMs in terms of the thermodynamic structure and the convective mass flux of the system, the latter being closely related to the surface convective precipitation. It is shown that the SCMs could reproduce reasonably well the time evolution of the surface convective and stratiform precipitation, the convective mass flux, and the thermodynamic structure of the squall-line system. The thermodynamic structure simulated by the SCMs depended on how the models partitioned the precipitation between convective and stratiform. However, structural differences persisted in the thermodynamic profiles simulated by the SCMs and the CRM. These differences could be attributed to the fact that the total mass flux used to compute the SCM forcing differed from the convective mass flux. The SCMs could not adequately represent the organized mesoscale circulations and the microphysical/radiative forcing associated with the stratiform region. This issue is generally known as the ‘scale-interaction’ problem, which can only be properly addressed in fully three-dimensional simulations. Sensitivity simulations run by several groups showed that the time evolution of the surface convective precipitation was considerably smoothed when the convective closure was based on convective available potential energy instead of moisture convergence. Finally, additional SCM simulations without a convection parametrization indicated that the impact of a convection parametrization in forced SCM runs was more visible in the moisture profiles than in the temperature profiles, because convective transport was particularly important in the moisture budget.
Abstract:
This research has responded to the need for diagnostic reference tools explicitly linking the influence of environmental uncertainty and performance within the supply chain. Uncertainty is a key factor influencing performance and an important measure of the operating environment. We develop and demonstrate a novel reference methodology based on data envelopment analysis (DEA) for examining the performance of value streams within the supply chain with specific reference to the level of environmental uncertainty they face. In this paper, using real industrial data, 20 product supply value streams within the European automotive industry sector are evaluated. Two are found to be efficient. The peer reference groups for the underperforming value streams are identified and numerical improvement targets are derived. The paper demonstrates how DEA can be used to guide supply chain improvement efforts through role-model identification and target setting, in a way that recognises the multiple dimensions/outcomes of the supply chain process and the influence of its environmental conditions. We have facilitated the contextualisation of environmental uncertainty and its incorporation into a specific diagnostic reference tool.
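For readers unfamiliar with DEA, the following Python sketch implements a standard input-oriented CCR envelopment model with scipy and scores a handful of hypothetical value streams. The data, the choice of inputs and outputs, and the treatment of environmental uncertainty are placeholders; they do not reproduce the paper's 20-stream automotive study.

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y):
    """Input-oriented CCR efficiency for each DMU (value stream).
    X: (n_dmu, n_inputs), Y: (n_dmu, n_outputs). Returns scores in (0, 1]."""
    n, m = X.shape
    s = Y.shape[1]
    scores = []
    for o in range(n):
        # Decision variables: [theta, lambda_1, ..., lambda_n]
        c = np.r_[1.0, np.zeros(n)]
        A_ub, b_ub = [], []
        for i in range(m):      # sum_j lambda_j * x_ij <= theta * x_io
            A_ub.append(np.r_[-X[o, i], X[:, i]])
            b_ub.append(0.0)
        for r in range(s):      # sum_j lambda_j * y_rj >= y_ro
            A_ub.append(np.r_[0.0, -Y[:, r]])
            b_ub.append(-Y[o, r])
        res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                      bounds=[(None, None)] + [(0, None)] * n, method="highs")
        scores.append(res.x[0])
    return np.array(scores)

# Illustrative data: 5 hypothetical value streams, 2 inputs (e.g. cost, lead
# time), 1 output (e.g. delivery performance). Not the paper's data.
X = np.array([[4.0, 2.0], [6.0, 3.0], [5.0, 5.0], [8.0, 4.0], [7.0, 6.0]])
Y = np.array([[10.0], [12.0], [11.0], [13.0], [9.0]])
print(np.round(dea_ccr_input(X, Y), 3))
```

A score of 1.0 marks an efficient value stream; for an underperforming one, the positive entries of the optimal lambda vector (res.x[1:]) would identify the peer reference group used for target setting.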
Abstract:
This paper reviews nine software packages with particular reference to their GARCH model estimation accuracy when judged against a respected benchmark. We consider the numerical consistency of GARCH and EGARCH estimation and forecasting. Our results have a number of implications for published research and future software development. Finally, we argue that the establishment of benchmarks for other standard non-linear models is long overdue.
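To make concrete what such benchmark comparisons test, here is a minimal Gaussian GARCH(1,1) likelihood in Python. Packages can disagree on exactly these ingredients (variance recursion start-up, parameter constraints, optimiser and convergence settings), which is one reason reported coefficients and forecasts differ; the initialisation choice and the simulated returns below are illustrative assumptions only.

```python
import numpy as np
from scipy.optimize import minimize

def garch11_neg_loglik(params, r):
    """Gaussian GARCH(1,1) negative log-likelihood.
    params = (mu, omega, alpha, beta); r = return series."""
    mu, omega, alpha, beta = params
    e = r - mu
    n = len(r)
    h = np.empty(n)
    h[0] = e.var()                      # a common (not universal) start-up choice
    for t in range(1, n):
        h[t] = omega + alpha * e[t - 1] ** 2 + beta * h[t - 1]
    return 0.5 * np.sum(np.log(2 * np.pi) + np.log(h) + e ** 2 / h)

# Estimation is then a constrained optimisation over (mu, omega, alpha, beta).
rng = np.random.default_rng(0)
r = rng.standard_normal(1000) * 0.01    # placeholder return series
res = minimize(garch11_neg_loglik, x0=(0.0, 1e-5, 0.05, 0.90), args=(r,),
               bounds=[(None, None), (1e-12, None), (0, 1), (0, 1)],
               method="L-BFGS-B")
print(res.x)
```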
Abstract:
This paper seeks to examine the particular operations of gender and cultural politics that both shaped and restrained possible 'networked' interactions between Jamaican women and their British 'motherlands' during the first forty years of the twentieth century. Paying particular attention to the poetry of Albinia Catherine MacKay (a Scots Creole) and the political journalism of Una Marson (a black Jamaican), I shall seek to examine why both writers speak in and of voices out of place. MacKay's poems work against the critical pull of transnational modernism to reveal aesthetic and cultural isolation through a model of strained belonging in relation to both her Jamaican home and an ancestral Scotland. A small number of poems from her 1912 collection, dedicated to the historical struggle between the English and Scots for the rule of Scotland and cultural self-determination, some of them written in a Scottish idiom, may help us to read the complex cultural negotiations that silently inform the seeming incommensurability of location and locution revealed in these works. In contrast, Marson's journalism, although less known even than her creative writings, is both politically and intellectually radical in its arguments concerning the mutual articulation of race and gender empowerment. However, Marson remains aware of her inability to articulate these convictions with force in a British context, and thereby of the way in which speaking out of place also silences her.
Abstract:
Our digital universe is rapidly expanding: more and more daily activities are digitally recorded, data arrives in streams, needs to be analyzed in real time, and may evolve over time. In the last decade many adaptive learning algorithms and prediction systems, which can automatically update themselves with new incoming data, have been developed. The majority of those algorithms focus on improving predictive performance and assume that a model update is always desired, as soon as possible and as frequently as possible. In this study we treat a potential model update as an investment decision which, as in the financial markets, should be taken only if a certain return on investment is expected. We introduce and motivate a new research problem for data streams: cost-sensitive adaptation. We propose a reference framework for analyzing adaptation strategies in terms of costs and benefits. Our framework allows us to characterize and decompose the costs of model updates, and to assess and interpret the gains in performance due to model adaptation for a given learning algorithm on a given prediction task. Our proof-of-concept experiment demonstrates how the framework can aid in analyzing and managing adaptation decisions in the chemical industry.
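Since the framework is described here only qualitatively, the toy Python sketch below merely conveys the flavour of treating a model update as an investment decision: retrain only when the expected benefit of the update exceeds its cost. The class, the cost figures, and the decision rule are hypothetical illustrations, not the authors' formulation.

```python
from dataclasses import dataclass

@dataclass
class AdaptationDecision:
    """Toy cost-benefit rule: retrain only when the expected return on the
    update outweighs its cost. All quantities are hypothetical placeholders."""
    update_cost: float               # e.g. labelling + compute + deployment cost
    value_per_error_avoided: float   # business value of one avoided error

    def should_update(self, current_error: float, expected_error_after: float,
                      predictions_until_next_review: int) -> bool:
        expected_gain = ((current_error - expected_error_after)
                         * predictions_until_next_review
                         * self.value_per_error_avoided)
        return expected_gain > self.update_cost

# Example: a 0.02 error-rate reduction over 10,000 predictions at 0.05 per
# avoided error gives an expected gain of 10.0, so a 4.0-cost update pays off.
policy = AdaptationDecision(update_cost=4.0, value_per_error_avoided=0.05)
print(policy.should_update(current_error=0.12, expected_error_after=0.10,
                           predictions_until_next_review=10_000))
```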
Abstract:
Current state-of-the-art global climate models produce different values for Earth’s mean temperature. When comparing simulations with each other and with observations it is standard practice to compare temperature anomalies with respect to a reference period. It is not always appreciated that the choice of reference period can affect conclusions, both about the skill of simulations of past climate, and about the magnitude of expected future changes in climate. For example, observed global temperatures over the past decade are towards the lower end of the range of CMIP5 simulations irrespective of what reference period is used, but exactly where they lie in the model distribution varies with the choice of reference period. Additionally, we demonstrate that projections of when particular temperature levels are reached, for example 2K above ‘pre-industrial’, change by up to a decade depending on the choice of reference period. In this article we discuss some of the key issues that arise when using anomalies relative to a reference period to generate climate projections. We highlight that there is no perfect choice of reference period. When evaluating models against observations, a long reference period should generally be used, but how long depends on the quality of the observations available. The IPCC AR5 choice to use a 1986-2005 reference period for future global temperature projections was reasonable, but a case-by-case approach is needed for different purposes and when assessing projections of different climate variables. Finally, we recommend that any studies that involve the use of a reference period should explicitly examine the robustness of the conclusions to alternative choices.
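As a small illustration of the point about baselines, this Python sketch computes anomalies of a synthetic annual temperature series relative to two different reference periods; the two anomaly series differ by a constant offset equal to the difference of the baseline means, which is what shifts observations or projections within a model spread. The series and the periods are placeholders, not real data.

```python
import numpy as np

# Hypothetical annual global-mean temperatures for 1850-2020 (placeholder
# trend plus noise), used only to show how the reference period shifts anomalies.
years = np.arange(1850, 2021)
rng = np.random.default_rng(1)
temps = 13.8 + 0.008 * (years - 1850) + rng.normal(0, 0.1, years.size)

def anomalies(temps, years, start, end):
    """Anomalies relative to the mean over the reference period [start, end]."""
    ref = temps[(years >= start) & (years <= end)].mean()
    return temps - ref

a_early = anomalies(temps, years, 1850, 1900)   # 'pre-industrial'-style baseline
a_ar5   = anomalies(temps, years, 1986, 2005)   # AR5-style baseline

# The two anomaly series differ by a constant offset equal to the difference
# of the two reference-period means:
print(round((a_early - a_ar5).mean(), 3))
```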
Abstract:
A new formal approach to the representation of polarization states of coherent and partially coherent electromagnetic plane waves is presented. Its basis is a purely geometric construction for the normalised complex-analytic coherent wave as a generating line in the sphere of wave directions, whose Stokes vector is determined by the intersection with the conjugate generating line. The Poincaré sphere is now located in physical space, simply as a coordination of the wave sphere, its axis aligned with the wave vector. Algebraically, the generators representing coherent states are given by spinors, and this is made consistent with the spinor-tensor representation of electromagnetic theory by means of an explicit reference spinor we call the phase flag. As a faithful unified geometric representation, the new model provides improved formal tools for resolving many of the geometric difficulties and ambiguities that arise in the traditional formalism.
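The geometric construction can be anchored against the familiar algebraic relation between a coherent wave's two-component spinor (Jones vector) and its Stokes vector which, up to the usual sign and ordering conventions, reads

\[
\psi=\begin{pmatrix}E_x\\ E_y\end{pmatrix},\qquad
S_0=|E_x|^2+|E_y|^2,\quad
S_1=|E_x|^2-|E_y|^2,\quad
S_2=2\,\mathrm{Re}\big(E_x\overline{E_y}\big),\quad
S_3=2\,\mathrm{Im}\big(E_x\overline{E_y}\big),
\]

or compactly \(S_\mu=\psi^\dagger\sigma_\mu\psi\) for a suitable ordering of the identity and Pauli matrices. For a fully polarized (coherent) wave \(S_1^2+S_2^2+S_3^2=S_0^2\), so the state lies on the Poincaré sphere; this is the standard relation that the paper's generating-line construction re-expresses geometrically, not the paper's own notation.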
Abstract:
The impact of extreme sea ice initial conditions on modelled climate is analysed for a fully coupled atmosphere-ocean-sea ice general circulation model, the Hadley Centre climate model HadCM3. A control run with greenhouse gas concentrations fixed at preindustrial levels is chosen as the reference experiment. Sensitivity experiments show an almost complete recovery from a total removal or a strong increase of sea ice after four years. Thus, uncertainties in initial sea ice conditions seem to be unimportant for climate modelling on decadal or longer time scales. When the initial conditions of the ocean mixed layer were adjusted to ice-free conditions, a few substantial differences remained for more than 15 model years. But these differences are clearly smaller than the uncertainty of the HadCM3 run and of all the other 19 IPCC Fourth Assessment Report climate model preindustrial runs. It remains an important task to improve climate models in simulating past sea ice variability so that they can make reliable projections for the 21st century.
Abstract:
In this paper, we formulate a flexible density function from the selection mechanism viewpoint (see, for example, Bayarri and DeGroot (1992) and Arellano-Valle et al. (2006)) which possesses nice biological and physical interpretations. The new density function contains as special cases many models that have been proposed recently in the literature. In constructing this model, we assume that the number of competing causes of the event of interest has a general discrete distribution characterized by its probability generating function. This function has an important role in the selection procedure as well as in computing the conditional personal cure rate. Finally, we illustrate how various models can be deduced as special cases of the proposed model.
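For context, the standard construction underlying such models takes \(N\) competing causes with probability generating function \(A_N\) and, assuming independent cause-specific latent times with common survival function \(S(t)\), yields the population survival and the cure fraction directly (the paper's formulation is more general, so this is only the familiar special case):

\[
S_{\mathrm{pop}}(t)=\sum_{n=0}^{\infty}P(N=n)\,S(t)^{n}=A_N\!\big(S(t)\big),
\qquad
p_0=\lim_{t\to\infty}S_{\mathrm{pop}}(t)=A_N(0)=P(N=0).
\]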
Abstract:
Security administrators face the challenge of designing, deploying and maintaining a variety of configuration files related to security systems, especially in large-scale networks. These files have heterogeneous syntaxes and follow differing semantic concepts. Nevertheless, they are interdependent, because security services have to cooperate and their configurations have to be consistent with each other so that global security policies are completely and correctly enforced. To tackle this problem, our approach supports a convenient definition of an abstract high-level security policy and provides an automated derivation of the desired configuration files. It is an extension of policy-based management and policy hierarchies, combining model-based management (MBM) with system modularization. MBM employs an object-oriented model of the managed system to obtain the details needed for automated policy refinement. The modularization into abstract subsystems (ASs) segments the system, and the model, into units which more closely encapsulate related system components and provide focused abstract views. As a result, scalability is achieved and even comprehensive IT systems can be modelled in a unified manner. The associated tool MoBaSeC (Model-Based-Service-Configuration) supports interactive graphical modelling, automated model analysis and policy refinement with the derivation of configuration files. We describe the MBM and AS approaches, outline the tool functions and exemplify their applications and the results obtained.
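To give a feel for the refinement idea, the following Python sketch derives per-service configuration snippets from one abstract policy by walking a toy object model of the managed system. Every class name, service kind, and configuration keyword here is invented for illustration and bears no relation to MoBaSeC's actual model, syntax, or output formats.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Service:
    name: str            # e.g. "fw-edge", "vpn-gw" (hypothetical)
    kind: str            # "firewall" or "vpn"

@dataclass
class AbstractSubsystem:
    name: str
    services: List[Service] = field(default_factory=list)

@dataclass
class Policy:
    allowed_ports: List[int]
    encrypt_external_traffic: bool

def refine(policy: Policy, subsystems: List[AbstractSubsystem]) -> Dict[str, str]:
    """Derive one configuration snippet per service from the abstract policy."""
    configs = {}
    for subsystem in subsystems:
        for svc in subsystem.services:
            if svc.kind == "firewall":
                lines = [f"allow tcp port {p}" for p in policy.allowed_ports]
                lines.append("deny all")
            elif svc.kind == "vpn":
                lines = ["require-encryption yes"
                         if policy.encrypt_external_traffic
                         else "require-encryption no"]
            else:
                lines = []
            configs[f"{subsystem.name}/{svc.name}"] = "\n".join(lines)
    return configs

policy = Policy(allowed_ports=[22, 443], encrypt_external_traffic=True)
model = [AbstractSubsystem("dmz", [Service("fw-edge", "firewall")]),
         AbstractSubsystem("site-link", [Service("vpn-gw", "vpn")])]
for target, text in refine(policy, model).items():
    print(f"# {target}\n{text}\n")
```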
Abstract:
The Rio Apa cratonic fragment crops out in Mato Grosso do Sul State, Brazil, and in northeastern Paraguay. It comprises Paleo-Mesoproterozoic medium-grade metamorphic rocks, intruded by granitic rocks, and is covered by the Neoproterozoic deposits of the Corumbá and Itapucumí Groups. To the east it is bounded by the southern portion of the Paraguay belt. In this work, more than 100 isotopic determinations, including U-Pb SHRIMP zircon ages, Rb-Sr and Sm-Nd whole-rock determinations, as well as K-Ar and Ar-Ar mineral ages, were reassessed in order to obtain a complete picture of its regional geological history. The tectonic evolution of the Rio Apa Craton starts with the formation of a series of magmatic arc complexes. The oldest U-Pb SHRIMP zircon age comes from a banded gneiss collected in the northern part of the region, with an age of 1950 +/- 23 Ma. The large granitic intrusion of the Alumiador Batholith yielded a U-Pb zircon age of 1839 +/- 33 Ma, and two orthogneisses from the southeastern part of the area gave zircon U-Pb ages of 1774 +/- 26 Ma and 1721 +/- 25 Ma. These may be coeval with the Alto Tererê metamorphic rocks of the northeastern corner, intruded in their turn by the Baía das Garças granitic rocks, one of which yielded a zircon U-Pb age of 1754 +/- 49 Ma. The original magmatic protoliths of these rocks involved some crustal component, as indicated by Sm-Nd TDM model ages between 1.9 and 2.5 Ga. Regional Sr isotopic homogenization, associated with tectonic deformation and medium-grade metamorphism, occurred at approximately 1670 Ma, as suggested by Rb-Sr whole-rock reference isochrons. Finally, the Ar-Ar work indicates that at about 1300 Ma the Rio Apa Craton was affected by widespread regional heating, when the temperature probably exceeded 350 degrees C. The geographic distribution, ages and isotopic signatures of the lithotectonic units suggest the existence of a major suture separating two different tectonic domains, juxtaposed at about 1670 Ma. From that time on, the unified Rio Apa continental block behaved as one coherent and stable tectonic unit. It correlates well with the SW corner of the Amazonian Craton, where the medium-grade rocks of the Juruena-Rio Negro tectonic province, with ages between 1600 and 1780 Ma, were reworked at about 1300 Ma. At the largest scale, the Rio Apa Craton is probably attached to the larger Amazonian Craton, and the present configuration of southwestern South America is possibly due to a complex arrangement of allochthonous blocks such as Arequipa, Antofalla and Pampia, of different sizes, that may have originated as disrupted parts of either Laurentia or Amazonia and were trapped during later collisions of these continental masses.
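For reference, whole-rock Rb-Sr ages of the kind cited above rest on the standard isochron relation, in which cogenetic samples define a line whose slope records the time elapsed since Sr isotopic homogenization (the decay constant is given here at its conventional value):

\[
\left(\frac{^{87}\mathrm{Sr}}{^{86}\mathrm{Sr}}\right)_{\!\text{measured}}
=\left(\frac{^{87}\mathrm{Sr}}{^{86}\mathrm{Sr}}\right)_{\!0}
+\frac{^{87}\mathrm{Rb}}{^{86}\mathrm{Sr}}\left(e^{\lambda t}-1\right),
\qquad
\lambda_{^{87}\mathrm{Rb}}\approx1.42\times10^{-11}\,\mathrm{yr^{-1}},
\]

so the slope \(e^{\lambda t}-1\) of the whole-rock array gives the age \(t\) and the intercept gives the initial ratio.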
Abstract:
Prediction of random effects is an important problem with expanding applications. In the simplest context, the problem corresponds to prediction of the latent value (the mean) of a realized cluster selected via two-stage sampling. Recently, Stanek and Singer [Predicting random effects from finite population clustered samples with response error. J. Amer. Statist. Assoc. 99, 119-130] developed best linear unbiased predictors (BLUP) under a finite population mixed model that outperform BLUPs from mixed models and superpopulation models. Their setup, however, does not allow for unequally sized clusters. To overcome this drawback, we consider an expanded finite population mixed model based on a larger set of random variables that span a higher dimensional space than those typically applied to such problems. We show that BLUPs for linear combinations of the realized cluster means derived under such a model have considerably smaller mean squared error (MSE) than those obtained from mixed models, superpopulation models, and finite population mixed models. We motivate our general approach by an example developed for two-stage cluster sampling and show that it faithfully captures the stochastic aspects of sampling in the problem. We also consider simulation studies to illustrate the increased accuracy of the BLUP obtained under the expanded finite population mixed model.
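As a baseline for the comparison described above, the familiar mixed-model BLUP (one of the predictors the expanded finite population approach is shown to outperform) can be written, under the usual linear mixed model \(y=X\beta+Zu+e\) with \(\operatorname{var}(u)=G\), \(\operatorname{var}(e)=R\) and \(V=ZGZ'+R\), as

\[
\hat{u}_{\mathrm{BLUP}}=GZ'V^{-1}\big(y-X\hat{\beta}\big),
\qquad
\hat{\beta}=\big(X'V^{-1}X\big)^{-1}X'V^{-1}y;
\]

the paper's predictor is derived instead under an expanded finite population mixed model and is reported to have smaller MSE than this standard form.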
Abstract:
Although CMMI (Capability Maturity Model Integration) provides more detailed coverage of the product life cycle than the isolated use of other improvement processes, it still cannot be regarded as a ready-to-use methodology for organizations. Each organization must map the process areas of the desired CMMI level (if the staged representation is chosen, as discussed below) to its own methodology and to its product and system development methods and techniques, also taking the organization's business objectives into account. CMMI, like other quality standards and models, says "what" to do, not "how" to do it. Determining this "how" is a substantial additional effort that organizations must undertake when adopting these standards. To do so, organizations usually turn to consulting firms with experience in the field. Such consultancies are highly recommended because they significantly increase the quality and speed of the results. This work aims to assist organizations interested in implementing a quality model by providing descriptions of some of the most widely used quality models, as well as process templates, guides and forms that can be used directly or as a basis for implementing the desired model. Although it applies to the implementation of any quality model, this work is aimed more specifically at helping organizations that intend to implement CMMI maturity level 2 (hereafter also abbreviated CMMI-N2). To that end, it describes this model in greater detail and provides a path for its implementation, including the description of a minimal software development process based on RUP (Rational Unified Process) and the use of a process improvement life-cycle model, IDEAL. The IDEAL model is proposed for implementing the quality model because it was originally conceived as a life-cycle model for software process improvement based on SW-CMM (Capability Maturity Model for Software). In conjunction with this model, it is suggested that project management techniques and processes be applied to each CMMI-N2 process area, so that each process area is deployed as a project. For the implementation, guides, implementation templates (forms) and a table mapping all CMMI-N2 process areas, their goals, practices, work products and associated tools are proposed.