11 results for look-ahead system
in CentAUR: Central Archive at the University of Reading - UK
Abstract:
The complexity inherent in climate data makes it necessary to introduce more than one statistical tool to the researcher to gain insight into the climate system. Empirical orthogonal function (EOF) analysis is one of the most widely used methods to analyze weather/climate modes of variability and to reduce the dimensionality of the system. Simple structure rotation of EOFs can enhance interpretability of the obtained patterns but cannot provide anything more than temporal uncorrelatedness. In this paper, an alternative rotation method based on independent component analysis (ICA) is considered. The ICA is viewed here as a method of EOF rotation. Starting from an initial EOF solution, rather than rotating the loadings toward simplicity, ICA seeks a rotation matrix that maximizes the independence between the components in the time domain. If the underlying climate signals have an independent forcing, one can expect to find loadings with interpretable patterns whose time coefficients have properties that go beyond the simple noncorrelation observed in EOFs. The methodology is presented and an application to the monthly mean sea level pressure (SLP) field is discussed. Among the rotated (to independence) EOFs, the North Atlantic Oscillation (NAO) pattern, an Arctic Oscillation–like pattern, and a Scandinavian-like pattern have been identified. There is the suggestion that the NAO is an intrinsic mode of variability independent of the Pacific.
Abstract:
It is now possible to directly link the human nervous system to a computer and thence onto the Internet. From an electronic and mental viewpoint this means that the Internet becomes an extension of the human nervous system (and vice versa). Such a connection on a regular or mass basis will have far-reaching effects for society. In this article the authors discuss their own practical implant self-experimentation, especially insofar as it relates to extending the human nervous system. Trials involving an intercontinental link-up are described. As well as technical aspects of the work, social, moral and ethical issues, as perceived by the authors, are weighed against potential technical gains. The authors also look at technical limitations inherent in the co-evolution of Internet-implanted individuals as well as the future distribution of intelligence between human and machine.
Abstract:
We discuss the feasibility of wireless terahertz communications links deployed in a metropolitan area and model the large-scale fading of such channels. The model takes into account reception through direct line of sight, ground and wall reflection, as well as diffraction around a corner. The movement of the receiver is modeled by an autonomous dynamic linear system in state space, whereas the geometric relations involved in the attenuation and multipath propagation of the electric field are described by a static nonlinear mapping. A subspace algorithm in conjunction with polynomial regression is used to identify a single-output Wiener model from time-domain measurements of the field intensity when the receiver motion is simulated using a constant angular speed and an exponentially decaying radius. The identification procedure is validated by using the model to perform q-step ahead predictions. The sensitivity of the algorithm to small-scale fading, detector noise, and atmospheric changes is discussed. The performance of the algorithm is tested in the diffraction zone assuming a range of emitter frequencies (2, 38, 60, 100, 140, and 400 GHz). Extensions of the simulation results to situations where a more complicated trajectory describes the motion of the receiver are also implemented, providing information on the performance of the algorithm under a worst case scenario. Finally, a sensitivity analysis to model parameters for the identified Wiener system is proposed.
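As an illustration of the q-step-ahead prediction used for validation, the sketch below iterates a Wiener model's linear state equation q steps forward and then applies the static output nonlinearity. The matrices and the polynomial are invented stand-ins, not the model identified in the paper.

```python
import numpy as np

# Hypothetical identified Wiener model: linear state-space dynamics
# followed by a static polynomial output nonlinearity.
A = np.array([[0.9, 0.1],
              [0.0, 0.8]])
B = np.array([[0.0],
              [1.0]])
C = np.array([[1.0, 0.0]])
poly = np.array([0.02, 0.5, 1.0])           # y = 0.02*z**2 + 0.5*z + 1.0

def q_step_ahead(x0, u_future, q):
    """Propagate the linear state q steps, then apply the nonlinearity."""
    x = x0
    for k in range(q):
        x = A @ x + B @ u_future[k]         # x[k+1] = A x[k] + B u[k]
    z = (C @ x).item()                      # intermediate (linear) output
    return np.polyval(poly, z)

y5 = q_step_ahead(np.zeros(2), np.ones((5, 1)), q=5)   # 5-step-ahead prediction
```

Repeating this at each sample along the simulated receiver trajectory yields the q-step-ahead predictions against which the identification is validated.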
Abstract:
In this paper stability of one-step ahead predictive controllers based on non-linear models is established. It is shown that, under conditions which can be fulfilled by most industrial plants, the closed-loop system is robustly stable in the presence of plant uncertainties and input–output constraints. There is no requirement that the plant should be open-loop stable, and the analysis is valid for general forms of non-linear system representation, including the case when the problem is constraint-free. The effectiveness of controllers designed according to the algorithm analyzed in this paper is demonstrated on a recognized benchmark problem and on a simulation of a continuous-stirred tank reactor (CSTR). In both examples a radial basis function neural network is employed as the non-linear system model.
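The one-step-ahead idea can be sketched minimally: at each sample the controller picks the admissible input whose one-step model prediction is closest to the setpoint. Here an invented scalar model stands in for the RBF network, and a grid search stands in for a proper constrained optimizer.

```python
import numpy as np

# Illustrative plant/model (a stand-in for the RBF network model in the paper)
def f(y, u):
    return 0.8 * y + 0.4 * np.tanh(u)

def one_step_ahead_control(y, setpoint, u_min=-2.0, u_max=2.0):
    """Pick the constrained input minimizing the one-step prediction error."""
    candidates = np.linspace(u_min, u_max, 401)   # input constraint set
    costs = (setpoint - f(y, candidates)) ** 2
    return candidates[np.argmin(costs)]

y, setpoint = 0.0, 1.0
trajectory = []
for _ in range(50):
    u = one_step_ahead_control(y, setpoint)
    y = f(y, u)                 # here plant and model coincide (no mismatch)
    trajectory.append(y)
```

The stability result in the paper concerns the realistic case where plant and model differ; this sketch only shows the mechanics of the one-step-ahead law under input constraints.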
Abstract:
A look is taken at the use of radial basis functions (RBFs) for nonlinear system identification. RBFs are first considered in detail in their own right and are subsequently compared with a multi-layered perceptron (MLP) in terms of performance and usage.
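One reason RBF networks are attractive for identification, relative to an MLP, is that once the centres and widths are fixed the model is linear in its output weights, so training reduces to least squares. A minimal sketch on synthetic data (the centres, width, and target function are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

# Gaussian radial basis functions with fixed centres and a common width
def design_matrix(x, centres, width):
    return np.exp(-((x[:, None] - centres[None, :]) ** 2) / (2.0 * width ** 2))

x = np.linspace(-3.0, 3.0, 200)
y = np.sin(x) + 0.05 * rng.normal(size=x.size)   # unknown nonlinear map + noise

centres = np.linspace(-3.0, 3.0, 15)
Phi = design_matrix(x, centres, width=0.5)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)      # linear-in-parameters fit

y_hat = Phi @ w
rmse = np.sqrt(np.mean((y_hat - np.sin(x)) ** 2))
```

An MLP fitted to the same data would need nonlinear (iterative) optimization of all its weights, which is the performance/usage trade-off the paper examines.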
Abstract:
Current methods for estimating vegetation parameters are generally sub-optimal in the way they exploit information and do not generally consider uncertainties. We look forward to a future where operational data assimilation schemes improve estimates by tracking land surface processes and exploiting multiple types of observations. Data assimilation schemes seek to combine observations and models in a statistically optimal way, taking into account uncertainty in both, but have not yet been much exploited in this area. The EO-LDAS scheme and prototype, developed under ESA funding, is designed to exploit the anticipated wealth of data that will be available under GMES missions, such as the Sentinel family of satellites, to provide improved mapping of land surface biophysical parameters. This paper describes the EO-LDAS implementation, and explores some of its core functionality. EO-LDAS is a weak constraint variational data assimilation system. The prototype provides a mechanism for constraint based on a prior estimate of the state vector, a linear dynamic model, and Earth Observation data (top-of-canopy reflectance here). The observation operator is a non-linear optical radiative transfer model for a vegetation canopy with a soil lower boundary, operating over the range 400 to 2500 nm. Adjoint codes for all model and operator components are provided in the prototype by automatic differentiation of the computer codes. In this paper, EO-LDAS is applied to the problem of daily estimation of six of the parameters controlling the radiative transfer operator over the course of a year (> 2000 state vector elements). Zero- and first-order process model constraints are implemented and explored as the dynamic model. The assimilation estimates all state vector elements simultaneously.
This is performed in the context of a typical Sentinel-2 MSI operating scenario, using synthetic MSI observations simulated with the observation operator, with uncertainties typical of those achieved by optical sensors assumed for the data. The experiments consider a baseline state vector estimation case where dynamic constraints are applied, and assess the impact of dynamic constraints on the a posteriori uncertainties. The results demonstrate that reductions in uncertainty by a factor of up to two might be obtained by applying the sorts of dynamic constraints used here. The hyperparameter (dynamic model uncertainty) required to control the assimilation is estimated by a cross-validation exercise. The result of the assimilation is seen to be robust to missing observations with quite large data gaps.
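The weak-constraint idea with a first-order process model can be sketched in a toy one-parameter setting: a quadratic cost balances sparse noisy observations against a penalty on day-to-day state changes, and the a posteriori uncertainty falls out of the inverse Hessian. Everything below (state size, uncertainties, the identity observation operator) is illustrative, not the EO-LDAS configuration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy daily state over 100 "days", observed every 7 days with noise
T = 100
truth = np.sin(np.linspace(0.0, 3.0 * np.pi, T))
obs_idx = np.arange(0, T, 7)
y = truth[obs_idx] + 0.1 * rng.normal(size=obs_idx.size)

sigma_obs, sigma_model = 0.1, 0.1
H = np.zeros((obs_idx.size, T))              # (linear) observation operator
H[np.arange(obs_idx.size), obs_idx] = 1.0
D = np.diff(np.eye(T), axis=0)               # first-order model: x[t+1] ~ x[t]

# J(x) = |y - Hx|^2 / sigma_obs^2 + |Dx|^2 / sigma_model^2 is quadratic,
# so the minimizer solves the normal equations A x = b
A = H.T @ H / sigma_obs**2 + D.T @ D / sigma_model**2
b = H.T @ y / sigma_obs**2
x_hat = np.linalg.solve(A, b)

# A is the posterior precision; its inverse gives a posteriori variances
post_sd = np.sqrt(np.diag(np.linalg.inv(A)))
```

In EO-LDAS itself the observation operator is a non-linear radiative transfer model and the adjoint supplies the gradients, but the structure of the cost function and of the uncertainty estimate is the same: uncertainty shrinks at and around observation times, which is the effect the dynamic constraint experiments quantify.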
Abstract:
The relevance of chaotic advection to stratospheric mixing and transport is addressed in the context of (i) a numerical model of forced shallow-water flow on the sphere, and (ii) a middle-atmosphere general circulation model. It is argued that chaotic advection applies to both these models if there is suitable large-scale spatial structure in the velocity field and if the velocity field is temporally quasi-regular. This spatial structure is manifested in the form of “cat’s eyes” in the surf zone, such as are commonly seen in numerical simulations of Rossby wave critical layers; by analogy with the heteroclinic structure of a temporally aperiodic chaotic system the cat’s eyes may be thought of as an “organizing structure” for mixing and transport in the surf zone. When this organizing structure exists, Eulerian and Lagrangian autocorrelations of the velocity derivatives indicate that velocity derivatives decorrelate more rapidly along particle trajectories than at fixed spatial locations (i.e., the velocity field is temporally quasi-regular). This phenomenon is referred to as Lagrangian random strain.
Abstract:
During the last 30 years, significant debate has taken place regarding multilevel research. However, the extent to which multilevel research is overtly practiced remains to be examined. This article analyzes 10 years of organizational research within a multilevel framework (from 2001 to 2011). The goals of this article are (a) to understand what has been done, during this decade, in the field of organizational multilevel research and (b) to suggest new arenas of research for the next decade. A total of 132 articles were selected for analysis through ISI Web of Knowledge. Through a broad-based literature review, results suggest that there is a balance between the number of empirical and conceptual papers regarding multilevel research, with most studies addressing the cross-level dynamics between teams and individuals. In addition, this study also found that time still has little presence in organizational multilevel research. Implications, limitations, and future directions are addressed in the end. Organizations are made of interacting layers. That is, between layers (such as divisions, departments, teams, and individuals) there is often some degree of interdependence that leads to bottom-up and top-down influence mechanisms. Teams and organizations are contexts for the development of individual cognitions, attitudes, and behaviors (top-down effects; Kozlowski & Klein, 2000). Conversely, individual cognitions, attitudes, and behaviors can also influence the functioning and outcomes of teams and organizations (bottom-up effects; Arrow, McGrath, & Berdahl, 2000). One example is how an organization’s reward system may influence employees’ intention to quit and the presence or absence of extra-role behaviors. At the same time, many studies have shown the importance of bottom-up emergent processes that yield higher level phenomena (Bashshur, Hernández, & González-Romá, 2011; Katz-Navon & Erez, 2005; Marques-Quinteiro, Curral, Passos, & Lewis, in press).
For example, the affectivity of individual employees may influence their team’s interactions and outcomes (Costa, Passos, & Bakker, 2012). Several authors agree that organizations must be understood as multilevel systems, meaning that adopting a multilevel perspective is fundamental to understand real-world phenomena (Kozlowski & Klein, 2000). However, whether this agreement is reflected in practicing multilevel research seems to be less clear. In fact, how much is known about the quantity and quality of multilevel research done in the last decade? The aim of this study is to compare what has been proposed theoretically, concerning the importance of multilevel research, with what has really been empirically studied and published. First, this article outlines a review of the multilevel theory, followed by what has been theoretically “put forward” by researchers. Second, this article presents what has really been “practiced” based on the results of a review of multilevel studies published from 2001 to 2011 in business and management journals. Finally, some barriers and challenges to true multilevel research are suggested. This study contributes to multilevel research as it describes the last 10 years of research. It quantitatively depicts the type of articles being written, and where we can find the majority of the publications on empirical and conceptual work related to multilevel thinking.
Abstract:
This is the second half of a two-part paper dealing with the social theoretic assumptions underlying system dynamics. In the first half it was concluded that analysing system dynamics using traditional, paradigm-based social theories is highly problematic. An innovative and potentially fruitful resolution is now proposed to these problems. In the first section it is argued that in order to find an appropriate social theoretic home for system dynamics it is necessary to look to a key exchange in contemporary social science: the agency/structure debate. This debate aims to move beyond both the theories based only on the actions of individual human agents, and those theories that emphasise only structural influences. Emerging from this debate are various theories that instead aim to unite the human agent view of the social realm with views that concentrate solely on system structure. It is argued that system dynamics is best viewed as being implicitly grounded in such theories. The main conclusion is therefore that system dynamics can contribute to an important part of social thinking by providing a formal approach for explicating social mechanisms. This conclusion is of general significance for system dynamics. However, the over-arching aim of the two-part paper is to increase the understanding of system dynamics in related disciplines. Four suggestions are therefore offered for how the system dynamics method might be extended further into the social sciences. It is argued that, presented in the right way, the formal yet contingent feedback causality thinking of system dynamics should diffuse widely in the social sciences and make a distinctive and important contribution to them. Felix qui potuit rerum cognoscere causas Happy is he who comes to know the causes of things Virgil - Georgics, Book II, line 490. 29 BCE
Abstract:
The Japanese government’s justification for retaining the death penalty is that abolition would erode the legitimacy of and public trust in the criminal justice system, leading to victims’ families taking justice into their own hands. This justification is based on the results of a regularly administered public opinion survey, which is said to show strong public support for the death penalty. However, a close analysis of the results of the 2014 survey fails to validate this claim. Just over a third of respondents were committed to retaining the death penalty at all costs, while the rest accepted the possibility of future abolition, with some of them seeing this as contingent on the introduction of life imprisonment without parole as an alternative sentence. These findings hardly describe a society that expects the strict application of the death penalty and whose trust in justice depends on the government’s commitment to retaining it. My reading of the 2014 survey is that the Japanese public is ready to embrace abolition. Japan, after all, is a signatory to the International Covenant on Civil and Political Rights, which calls on states not to delay or prevent abolition, so this should be welcome news for the Japanese government!
Abstract:
Preparing for episodes with risks of anomalous weather a month to a year ahead is an important challenge for governments, non-governmental organisations, and private companies and is dependent on the availability of reliable forecasts. The majority of operational seasonal forecasts are made using process-based dynamical models, which are complex, computationally challenging and prone to biases. Empirical forecast approaches built on statistical models to represent physical processes offer an alternative to dynamical systems and can provide either a benchmark for comparison or independent supplementary forecasts. Here, we present a simple empirical system based on multiple linear regression for producing probabilistic forecasts of seasonal surface air temperature and precipitation across the globe. The global CO2-equivalent concentration is taken as the primary predictor; subsequent predictors, including large-scale modes of variability in the climate system and local-scale information, are selected on the basis of their physical relationship with the predictand. The focus given to the climate change signal as a source of skill and the probabilistic nature of the forecasts produced constitute a novel approach to global empirical prediction. Hindcasts for the period 1961–2013 are validated against observations using deterministic (correlation of seasonal means) and probabilistic (continuous ranked probability skill score) metrics. Good skill is found in many regions, particularly for surface air temperature and most notably in much of Europe during the spring and summer seasons. For precipitation, skill is generally limited to regions with known El Niño–Southern Oscillation (ENSO) teleconnections. The system is used in a quasi-operational framework to generate empirical seasonal forecasts on a monthly basis.
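The regression backbone of such a system is simple enough to sketch: a trend predictor (CO2-equivalent) plus a mode-of-variability index enter a multiple linear regression, and hindcast skill is assessed out-of-sample. The predictors, coefficients, and "observations" below are synthetic stand-ins for illustration, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic seasonal-mean temperature driven by a CO2-like trend and an
# ENSO-like index (all numbers are illustrative)
years = np.arange(1961, 2014)
co2 = 315.0 + 1.5 * (years - 1961)           # smooth trend predictor
enso = np.sin(0.9 * years)                   # oscillatory mode predictor
temp = 0.01 * co2 + 0.3 * enso + 0.2 * rng.normal(size=years.size)

X = np.column_stack([np.ones_like(co2), co2, enso])
beta, *_ = np.linalg.lstsq(X, temp, rcond=None)   # full-sample fit

# Leave-one-out hindcasts: refit without each year, predict that year,
# then validate with the hindcast/observation correlation
hindcast = np.empty_like(temp)
for i in range(years.size):
    keep = np.arange(years.size) != i
    b, *_ = np.linalg.lstsq(X[keep], temp[keep], rcond=None)
    hindcast[i] = X[i] @ b
skill = np.corrcoef(hindcast, temp)[0, 1]
```

A probabilistic forecast would add a spread (for example, derived from the regression residuals) around each hindcast mean, which is what a continuous ranked probability skill score then evaluates.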