18 results for State Universities Retirement System (Ill.)
in CentAUR: Central Archive University of Reading - UK
Abstract:
Optimal state estimation from given observations of a dynamical system by data assimilation is generally an ill-posed inverse problem. In order to solve the problem, a standard Tikhonov, or L2, regularization is used, based on certain statistical assumptions on the errors in the data. The regularization term constrains the estimate of the state to remain close to a prior estimate. In the presence of model error, this approach does not capture the initial state of the system accurately, as the initial state estimate is derived by minimizing the average error between the model predictions and the observations over a time window. Here we examine an alternative L1 regularization technique that has proved valuable in image processing. We show that for examples of flow with sharp fronts and shocks, the L1 regularization technique performs more accurately than standard L2 regularization.
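As a rough illustration of the comparison described above (not the paper's own experiments), the sketch below estimates a state with sharp jumps from noisy linear observations, once with a Tikhonov (L2) penalty and once with an L1 penalty towards the same background estimate; the observation operator, noise level and regularization weight are all hypothetical choices.

```python
# Toy comparison of L2 (Tikhonov) vs L1 regularization for a linear inverse
# problem y = H x + noise, with a "front-like" true state containing sharp
# jumps.  All quantities here are illustrative, not taken from the paper.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n, m = 20, 10
H = rng.standard_normal((m, n))            # hypothetical observation operator
x_true = np.sign(np.sin(np.arange(n)))     # state with sharp fronts
y = H @ x_true + 0.05 * rng.standard_normal(m)
xb = np.zeros(n)                           # prior (background) estimate
alpha = 0.1                                # regularization weight

def cost_l2(x):
    return np.sum((H @ x - y) ** 2) + alpha * np.sum((x - xb) ** 2)

def cost_l1(x):
    return np.sum((H @ x - y) ** 2) + alpha * np.sum(np.abs(x - xb))

x_l2 = minimize(cost_l2, xb).x
x_l1 = minimize(cost_l1, xb, method="Powell").x   # derivative-free: L1 term is non-smooth
print("L2-regularized error:", np.linalg.norm(x_l2 - x_true))
print("L1-regularized error:", np.linalg.norm(x_l1 - x_true))
```

The L1 penalty tends to preserve sharp transitions where a quadratic penalty smooths them out, which is the effect the abstract reports for flows with fronts and shocks.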
Abstract:
For the very large nonlinear dynamical systems that arise in a wide range of physical, biological and environmental problems, the data needed to initialize a numerical forecasting model are seldom available. To generate accurate estimates of the expected states of the system, both current and future, the technique of ‘data assimilation’ is used to combine the numerical model predictions with observations of the system measured over time. Assimilation of data is an inverse problem that for very large-scale systems is generally ill-posed. In four-dimensional variational assimilation schemes, the dynamical model equations provide constraints that act to spread information into data sparse regions, enabling the state of the system to be reconstructed accurately. The mechanism for this is not well understood. Singular value decomposition techniques are applied here to the observability matrix of the system in order to analyse the critical features in this process. Simplified models are used to demonstrate how information is propagated from observed regions into unobserved areas. The impact of the size of the observational noise and the temporal position of the observations is examined. The best signal-to-noise ratio needed to extract the most information from the observations is estimated using Tikhonov regularization theory. Copyright © 2005 John Wiley & Sons, Ltd.
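A minimal sketch of the kind of analysis described, assuming a toy linear model and a sparse observation pattern of my own choosing: the observation operators acting through the model propagator are stacked into an observability matrix, whose singular values indicate which state directions the data can constrain.

```python
# Sketch (simplified, not the paper's setup): build the observability matrix
# for a linear model x_{k+1} = M x_k observed through H at several times,
# then inspect its singular values to see which state directions the
# observations constrain and which remain unobserved.
import numpy as np

n = 10                                   # state dimension
M = np.roll(np.eye(n), 1, axis=1)        # cyclic shift: a crude advection model
H = np.zeros((3, n)); H[0, 0] = H[1, 3] = H[2, 7] = 1.0   # observe 3 grid points

times = [0, 2, 4]                        # observation times within the window
blocks = []
Mk = np.eye(n)
for k in range(max(times) + 1):
    if k in times:
        blocks.append(H @ Mk)
    Mk = M @ Mk
G = np.vstack(blocks)                    # observability matrix of the window

s = np.linalg.svd(G, compute_uv=False)
print("singular values:", np.round(s, 3))
print("directions constrained by the data:", np.sum(s > 1e-10))
```

Small singular values correspond to directions where only heavily damped (noise-sensitive) information is available, which is where the signal-to-noise considerations and Tikhonov regularization arguments of the abstract enter.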
Abstract:
Four-dimensional variational data assimilation (4D-Var) is used in environmental prediction to estimate the state of a system from measurements. When 4D-Var is applied in the context of high resolution nested models, problems may arise in the representation of spatial scales longer than the domain of the model. In this paper we study how well 4D-Var is able to estimate the whole range of spatial scales present in one-way nested models. Using a model of the one-dimensional advection–diffusion equation we show that small spatial scales that are observed can be captured by a 4D-Var assimilation, but that information in the larger scales may be degraded. We propose a modification to 4D-Var which allows a better representation of these larger scales.
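For concreteness, here is a minimal one-dimensional advection-diffusion step of the kind that could serve as the test model mentioned above; the scheme (first-order upwind advection plus explicit diffusion), grid and coefficients are illustrative choices, not the paper's configuration.

```python
# Minimal sketch of a 1D advection-diffusion model step (periodic domain).
# Velocity, diffusion coefficient, grid spacing and time step are illustrative.
import numpy as np

def step(u, c=1.0, kappa=0.01, dx=0.1, dt=0.01):
    """Advance u_t + c u_x = kappa u_xx by one explicit time step."""
    adv = -c * (u - np.roll(u, 1)) / dx                          # upwind (c > 0)
    dif = kappa * (np.roll(u, -1) - 2 * u + np.roll(u, 1)) / dx ** 2
    return u + dt * (adv + dif)

x = np.linspace(0, 2 * np.pi, 64, endpoint=False)
u = np.exp(-10 * (x - np.pi) ** 2)     # initial bump with both small and large scales
for _ in range(100):
    u = step(u)
```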
Abstract:
Ensemble forecasting of nonlinear systems involves the use of a model to run forward a discrete ensemble (or set) of initial states. Data assimilation techniques tend to focus on estimating the true state of the system, even though model error limits the value of such efforts. This paper argues for choosing the initial ensemble in order to optimise forecasting performance rather than estimate the true state of the system. Density forecasting and choosing the initial ensemble are treated as one problem. Forecasting performance can be quantified by some scoring rule. In the case of the logarithmic scoring rule, theoretical arguments and empirical results are presented. It turns out that, if the underlying noise dominates model error, we can diagnose the noise spread.
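As a small illustration of scoring an ensemble density forecast with the logarithmic rule (the details below, including the kernel density estimate, are my own illustrative choices rather than the paper's method):

```python
# Sketch: score an ensemble density forecast with the logarithmic (ignorance)
# score.  The ensemble is converted to a density with a Gaussian kernel
# estimate; a lower score means the verification received higher density.
# The ensemble, verification and bandwidth choice here are illustrative only.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)
ensemble = rng.normal(loc=0.2, scale=1.0, size=50)   # hypothetical forecast ensemble
verification = 0.5                                   # observed outcome

density = gaussian_kde(ensemble)                     # ensemble -> forecast density
log_score = -np.log(density(verification)[0])        # logarithmic scoring rule
print("ignorance (log) score:", round(log_score, 3))
```

Choosing the initial ensemble to minimise the expected value of such a score, rather than to estimate the true state, is the viewpoint argued for in the abstract.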
Abstract:
Cholesterol is one of the key constituents for maintaining the cellular membrane and thus the integrity of the cell itself. In contrast, high levels of cholesterol in the blood are known to be a major risk factor in the development of cardiovascular disease. We formulate a deterministic nonlinear ordinary differential equation model of the sterol regulatory element binding protein 2 (SREBP-2) cholesterol genetic regulatory pathway in a hepatocyte. The mathematical model includes a description of gene transcription by SREBP-2, with the resulting mRNA subsequently translated to form 3-hydroxy-3-methylglutaryl coenzyme A reductase (HMGCR), a key enzyme of cholesterol synthesis. Cholesterol synthesis in turn regulates SREBP-2 via a negative feedback formulation. Parameterised with data from the literature, the model is used to understand how SREBP-2 transcription and regulation affects cellular cholesterol concentration. Model stability analysis shows that the only positive steady state of the system exhibits purely oscillatory, damped oscillatory or monotonic behaviour under certain parameter conditions. In light of our findings we postulate how cholesterol homeostasis is maintained within the cell, and the advantages of our model formulation are discussed with respect to other models of genetic regulation in the literature.
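A generic sketch of a transcription-translation-product loop with negative feedback of the product on transcription, integrated with scipy; the equations and parameter values are placeholders for illustration and are not the parameterised SREBP-2 model of the paper.

```python
# Generic negative-feedback gene regulation sketch: mRNA -> enzyme -> product,
# with the product repressing transcription (a Hill-type term).  Equations and
# parameters are illustrative placeholders, not the paper's SREBP-2 model.
import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, y, k_m=1.0, k_p=1.0, k_c=0.5, d_m=0.2, d_p=0.1, d_c=0.05, K=1.0, n=2):
    m, p, c = y                                # mRNA, enzyme, product
    dm = k_m / (1 + (c / K) ** n) - d_m * m    # transcription repressed by product
    dp = k_p * m - d_p * p                     # translation
    dc = k_c * p - d_c * c                     # synthesis and turnover
    return [dm, dp, dc]

sol = solve_ivp(rhs, (0, 200), [0.0, 0.0, 0.0], dense_output=True)
print("state at end of run:", np.round(sol.y[:, -1], 3))
```

Depending on the parameter values, trajectories of such a loop approach the steady state monotonically, with damped oscillations, or oscillate, which is the qualitative behaviour the stability analysis in the abstract distinguishes.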
Abstract:
Variational data assimilation is commonly used in environmental forecasting to estimate the current state of the system from a model forecast and observational data. The assimilation problem can be written simply in the form of a nonlinear least squares optimization problem. However the practical solution of the problem in large systems requires many careful choices to be made in the implementation. In this article we present the theory of variational data assimilation and then discuss in detail how it is implemented in practice. Current solutions and open questions are discussed.
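In the standard strong-constraint formulation, the nonlinear least squares problem referred to above is the minimisation over the initial state of a cost function of the following form (conventional notation: background state x^b with error covariance B, observation operators H_i, observation error covariances R_i, model propagators M_{0->i}; this is the textbook form, not a statement of any particular implementation choice in the article):

```latex
J(x_0) = \tfrac{1}{2}\,(x_0 - x^b)^{\mathrm T} B^{-1} (x_0 - x^b)
       + \tfrac{1}{2}\sum_{i=0}^{N} \bigl(y_i - \mathcal{H}_i(x_i)\bigr)^{\mathrm T}
         R_i^{-1}\, \bigl(y_i - \mathcal{H}_i(x_i)\bigr),
\qquad x_i = \mathcal{M}_{0 \to i}(x_0).
```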
Abstract:
The disadvantage of the majority of data assimilation schemes is the assumption that the conditional probability density function of the state of the system given the observations [the posterior probability density function (PDF)] is distributed either locally or globally as a Gaussian. The advantage, however, is that through various different mechanisms they ensure initial conditions that are predominantly in linear balance, and therefore spurious gravity wave generation is suppressed. The equivalent-weights particle filter is a data assimilation scheme that allows for a representation of a potentially multimodal posterior PDF. It does this via proposal densities that lead to extra terms being added to the model equations, which means that the advantage of the traditional data assimilation schemes in generating predominantly balanced initial conditions is no longer guaranteed. This paper looks in detail at the impact the equivalent-weights particle filter has on dynamical balance and gravity wave generation in a primitive equation model. The primary conclusions are that (i) provided the model error covariance matrix imposes geostrophic balance, then each additional term required by the equivalent-weights particle filter is also geostrophically balanced; (ii) the relaxation term required to ensure the particles are in the locality of the observations has little effect on gravity waves and actually induces a reduction in gravity wave energy if sufficiently large; and (iii) the equivalent-weights term, which leads to the particles having equivalent significance in the posterior PDF, produces a change in gravity wave energy comparable to the stochastic model error. Thus, the scheme does not produce significant spurious gravity wave energy and so has potential for use in real high-dimensional geophysical applications.
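For reference, the balance notion at issue here is geostrophic balance, in which the Coriolis force balances the horizontal pressure gradient; increments satisfying this relation project onto slow modes rather than gravity waves. The standard relation is written below for a shallow-water-like setting, as background rather than as the specific constraint used in the paper's model.

```latex
f\,\mathbf{k} \times \mathbf{u} = -g\,\nabla \eta
\quad\Longleftrightarrow\quad
u = -\frac{g}{f}\frac{\partial \eta}{\partial y}, \qquad
v = \frac{g}{f}\frac{\partial \eta}{\partial x}.
```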
Abstract:
Integrations of a fully-coupled climate model with and without flux adjustments in the equatorial oceans are performed under 2×CO2 conditions to explore in more detail the impact of increased greenhouse gas forcing on the monsoon-ENSO system. When flux adjustments are used to correct some systematic model biases, ENSO behaviour in the modelled future climate features distinct irregular and periodic (biennial) regimes. Comparison with the observed record yields some consistency with ENSO modes primarily based on air-sea interaction and those dependent on basinwide ocean wave dynamics. Simple theory is also used to draw analogies between the regimes and irregular (stochastically forced) and self-excited oscillations respectively. Periodic behaviour is also found in the Asian-Australian monsoon system, part of an overall biennial tendency of the model under these conditions related to strong monsoon forcing and increased coupling between the Indian and Pacific Oceans. The tropospheric biennial oscillation (TBO) thus serves as a useful descriptor for the coupled monsoon-ENSO system in this case. The presence of obvious regime changes in the monsoon-ENSO system on interdecadal timescales, when using flux adjustments, suggests there may be greater uncertainty in projections of future climate, although further modelling studies are required to confirm the realism and cause of such changes.
Abstract:
The impact of doubled CO2 concentration on the Asian summer monsoon is studied using a coupled ocean-atmosphere model. Both the mean seasonal precipitation and interannual monsoon variability are found to increase in the future climate scenario presented. Systematic biases in current climate simulations of the coupled system prevent accurate representation of the monsoon-ENSO teleconnection, of prime importance for seasonal prediction and for determining monsoon interannual variability. By applying seasonally varying heat flux adjustments to the tropical Pacific and Indian Ocean surface in the future climate simulation, some assessment can be made of the impact of systematic model biases on future climate predictions. In simulations where the flux adjustments are implemented, the response to climate change is magnified, with the suggestion that systematic biases may be masking the true impact of increased greenhouse gas forcing. The teleconnection between ENSO and the Asian summer monsoon remains robust in the future climate, although the Indo-Pacific takes on more of a biennial character for long periods of the flux-adjusted simulation. Assessing the teleconnection across interdecadal timescales shows wide variations in its amplitude, despite the absence of external forcing. This suggests that recent changes in the observed record cannot be distinguished from internal variations and as such are not necessarily related to climate change.
Abstract:
The phase separation behaviour in aqueous mixtures of poly(methyl vinyl ether) and hydroxypropylcellulose has been studied by the cloud point method and viscometric measurements. The miscibility of these blends in the solid state has been assessed by infrared spectroscopy, methanol vapour sorption experiments and scanning electron microscopy. The values of the Gibbs energy of mixing of the polymers and their blends with methanol, as well as with each other, were calculated. It was found that in the solid state the polymers can interact with methanol very well but the polymer-polymer interactions are unfavourable. Although in aqueous solutions the polymers exhibit some intermolecular interactions, their solid blends are not completely miscible. (C) 2005 Elsevier Ltd. All rights reserved.
Abstract:
National food control systems are a key element in the protection of consumers from unsafe foods and from other fraudulent practices. International guidance is available and provides a framework for enhancing national systems. However, it is recognized that before reaching decisions on the necessary improvements to a national system, an analysis is required of the current state of key elements in the present system. This paper provides such an analysis for the State of Kuwait. The fragmented nature of the food control system is described. Four key elements of the Kuwaiti system are analyzed: the legal framework, the administrative structures, the enforcement activity and the provision of education and training. It is noted that the country has a dependence on imported foods and that the present national food control system is largely based on an historic approach to food sampling at the point of import and is unsustainable. The paper recommends a more coordinated approach to food safety control in Kuwait with a significant increase in the use of risk analysis methods to target enforcement.
Abstract:
The concept of “distance to instability” of a system matrix is generalized to system pencils which arise in descriptor (semistate) systems. Difficulties arise in the case of singular systems, because the pencil can be made unstable by an infinitesimal perturbation. It is necessary to measure the distance subject to restricted, or structured, perturbations. In this paper a suitable measure for the stability radius of a generalized state-space system is defined, and a computable expression for the distance to instability is derived for regular pencils of index less than or equal to one. For systems which are strongly controllable it is shown that this measure is related to the sensitivity of the poles of the system over all feedback matrices assigning the poles.
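For background, in the ordinary state-space case that is being generalized here, the distance to instability of a stable matrix A has the well-known characterisation below (with respect to complex unstructured perturbations); the pencil case treated in the paper replaces A - iωI by A - iωE and restricts to structured perturbations.

```latex
\beta(A) \;=\; \min\{\, \|E\|_2 \;:\; A + E \ \text{has an eigenvalue on the imaginary axis} \,\}
         \;=\; \min_{\omega \in \mathbb{R}} \sigma_{\min}(A - \mathrm{i}\,\omega I).
```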
Abstract:
Robustness in multi-variable control system design requires that the solution to the design problem be insensitive to perturbations in the system data. In this paper we discuss measures of robustness for generalized state-space, or descriptor, systems and describe algorithmic techniques for optimizing robustness for various applications.
Abstract:
Limnologists had an early preoccupation with lake classification. It gave a necessary structure to the many chemical and biological observations that were beginning to form the basis of one of the earliest truly environmental sciences. August Thienemann was the doyen of such classifiers, and his concept, with Einar Naumann, of oligotrophic and eutrophic lakes remains central to the world-view that limnologists still have. Classification fell into disrepute, however, as it became clear that there would always be lakes that deviated from the prescriptions that the classifiers made for them. Continua became the de rigueur concept and lakes were seen as varying along many chemical, biological and geographic axes. Modern limnologists are comfortable with this concept. That all lakes are different guarantees an indefinite future for limnological research. For those who manage lakes and the landscapes in which they are set, however, it is not very useful. There may be as many as 300,000 standing water bodies in England and Wales alone and perhaps as many again in Scotland. More than 80,000 are sizable (> 1 ha). Some classification scheme is needed to cope with these numbers and, as human impacts on them increase, a system of assessing and monitoring change must be built into such a scheme. Although ways of classifying and monitoring running waters are well developed in the UK, the same is not true of standing waters. Sufficient understanding of what determines the nature and functioning of lakes exists to create a system which has intellectual credibility as well as practical usefulness. This paper outlines the thinking behind a system which will be workable on a north European basis and presents some early results.
Abstract:
Sri Lanka's participation rates in higher education are low and have risen only slightly in the last few decades; the number of places for higher education in the state university system caters for only around 3% of the university entrant age cohort. The literature reveals that the highly competitive global knowledge economy increasingly favours workers with high levels of education who are also lifelong learners. This lack of access to higher education for a sizable proportion of the labour force is identified as a severe impediment to Sri Lanka's competitiveness in the global knowledge economy. The literature also suggests that Information and Communication Technologies are increasingly relied upon in many contexts in order to deliver flexible learning, catering especially for the needs of lifelong learners in today's higher educational landscape. The government of Sri Lanka invested heavily in ICTs for distance education during the period 2003-2009 in a bid to increase access to higher education, but there has been little research into the impact of this. To address this gap, this study investigated the impact of ICTs on distance education in Sri Lanka with respect to increasing access to higher education. In order to achieve this aim, the research examined Sri Lanka's effort from three perspectives: the policy perspective, the implementation perspective and the user perspective. A multiple case study using an ethnographic approach was conducted to observe Orange Valley University's and Yellow Fields University's (pseudonymous) implementation of distance education programmes, drawing on questionnaires, qualitative interviewing and document analysis. In total, data for the analysis were collected from 129 questionnaires, 33 individual interviews and 2 group interviews. The research revealed that ICTs have indeed increased opportunities for higher education, but mainly for people from affluent families in the Western Province. Issues identified were categorized under the themes of quality assurance, location, language, digital literacies and access to resources. Recommendations were offered to tackle the identified issues in accordance with the study findings. The study also revealed the strong presence of a multifaceted digital divide in the country. In conclusion, this research has shown that although ICT-enabled distance education has the potential to increase access to higher education, the present implementation of the system in Sri Lanka has been less than successful.