898 results for Deterministic imputation


Relevance:

10.00%

Publisher:

Abstract:

This paper examines everyday living room interactions in which teenage household members conduct 'tactical' play in order to temporarily gain access to, and disrupt, the dominant domestic codes of living room media. The practices of individuals are interpreted, through Michel de Certeau's language of 'tactics', as struggles or a series of opportunistic actions which can often reforge these codes of living, precisely because the house 'rules' are not fixed or deterministic in practice. In these tactical performances of self, the use of media is enmeshed in a host of situated and symbolic action, reaffirming how media and face-to-face interactions are multiply and closely entwined in everyday living room life. This video ethnographic work examines such instances of teenagers appealing to 'house' rules and demonstrating domestic helpfulness in order to gain access to media, and the tethering of media to objects through the routine practice of 'markers' and 'stalls'.

Relevance:

10.00%

Publisher:

Abstract:

Many numerical models for weather prediction and climate studies are run at resolutions that are too coarse to resolve convection explicitly, but too fine to justify the local equilibrium assumed by conventional convective parameterizations. The Plant-Craig (PC) stochastic convective parameterization scheme, developed in this paper, solves this problem by removing the assumption that a given grid-scale situation must always produce the same sub-grid-scale convective response. Instead, for each timestep and gridpoint, one of the many possible convective responses consistent with the large-scale situation is randomly selected. The scheme requires as input the large-scale state as opposed to the instantaneous grid-scale state, but must nonetheless be able to account for genuine variations in the large-scale situation. Here we investigate the behaviour of the PC scheme in three-dimensional simulations of radiative-convective equilibrium, demonstrating in particular that the necessary space-time averaging required to produce a good representation of the input large-scale state is not in conflict with the requirement to capture large-scale variations. The resulting equilibrium profiles agree well with those obtained from established deterministic schemes, and with corresponding cloud-resolving model simulations. Unlike the conventional schemes, the statistics for mass flux and rainfall variability from the PC scheme also agree well with relevant theory and vary appropriately with spatial scale. The scheme is further shown to adapt automatically to changes in grid length and in forcing strength.
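The scheme's central move, drawing one of many possible convective responses consistent with the same large-scale state, can be sketched in a few lines. In the statistical-equilibrium theory underlying the scheme, individual plume mass fluxes follow an exponential distribution and the plume count is Poisson; the parameter values below are illustrative, not the scheme's operational settings.

```python
import math
import random

def sample_convective_response(mean_total_flux, mean_plume_flux, rng):
    """Draw one realisation of the sub-grid convective mass flux.

    The plume count is Poisson, with mean chosen so that the ensemble
    average of the total flux equals the large-scale value; individual
    plume fluxes are exponentially distributed.
    """
    lam = mean_total_flux / mean_plume_flux
    # Poisson sample by inversion (the stdlib has no Poisson generator).
    n, u, term = 0, rng.random(), math.exp(-lam)
    cumulative = term
    while u > cumulative:
        n += 1
        term *= lam / n
        cumulative += term
    return sum(rng.expovariate(1.0 / mean_plume_flux) for _ in range(n))

rng = random.Random(0)
draws = [sample_convective_response(10.0, 2.0, rng) for _ in range(20000)]
mean_flux = sum(draws) / len(draws)
```

Each call returns a different sub-grid response for the same large-scale input, which is exactly what distinguishes the scheme from a deterministic closure; averaging many draws recovers the large-scale flux.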

Relevance:

10.00%

Publisher:

Abstract:

This paper argues for the relevance of paying attention to structuring participation processes across scales as one of the ways in which participation of multi-organisational partnerships that involve conflicting interests might be managed. In terms of issues, the paper deals with problems in connection with land mobilisation for road widening in complex and concentrated high-value urban settings. It discusses a case study of plan implementation involving individual landowners, the land development market, the local government, other governmental and non-governmental organisations and the state government, which together achieved objectives that seemed impossible at first sight. In theoretical terms, the paper engages with Jessop's (2001) Strategic-Relational Approach (SRA), arguing for its potential for informing action in a way that is capable of achieving steering outputs. The claim for SRA is demonstrated by re-examining the case study. The factors that come through as SRA is applied are drawn out, and it is suggested that the theory, though non-deterministic, helps guide action by highlighting certain dynamics of systems that can be used for institutional intervention. These dynamics point to the importance of paying attention to scale and the way in which participation and negotiation processes are structured so as to favour certain outcomes rather than others.

Relevance:

10.00%

Publisher:

Abstract:

This paper investigates whether using natural logarithms (logs) of price indices for forecasting inflation rates is preferable to employing the original series. Univariate forecasts of annual inflation rates for a number of European countries and the USA, based on monthly seasonal consumer price indices, are considered. Stochastic seasonality and deterministic seasonality models are used. In many cases, the forecasts based on the original variables result in substantially smaller root mean squared errors than models based on logs. In turn, when forecasts based on logs are superior, the gains are typically small. This outcome casts doubt on the common practice in the academic literature of forecasting inflation rates based on differences of logs.
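The object of comparison can be made concrete with a toy example: year-on-year inflation computed from a 12-month-ahead forecast of the index in levels versus in logs, using a naive drift extrapolation. The data here are synthetic and exactly exponential, so the log forecast wins by construction; the paper's point is that on real inflation data the levels specification often does better.

```python
import math

# Synthetic monthly price index growing at 0.5% per month (illustrative).
index = [100.0 * 1.005 ** t for t in range(60)]
t = 36  # forecast origin

# Naive 12-month-ahead forecast in levels: extrapolate the average
# monthly change observed over the past year.
drift_level = (index[t] - index[t - 12]) / 12.0
forecast_level = index[t] + 12.0 * drift_level

# The same rule applied to the log of the index.
log_index = [math.log(p) for p in index]
drift_log = (log_index[t] - log_index[t - 12]) / 12.0
forecast_log = math.exp(log_index[t] + 12.0 * drift_log)

truth = index[t + 12] / index[t] - 1.0        # realised annual inflation
infl_level = forecast_level / index[t] - 1.0  # implied by levels forecast
infl_log = forecast_log / index[t] - 1.0      # implied by log forecast
```

On exponentially growing data the linear (levels) extrapolation underestimates inflation while the log forecast is exact; which specification wins on real series is precisely the empirical question the paper settles.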

Relevance:

10.00%

Publisher:

Abstract:

We introduce a model for a pair of nonlinear evolving networks, defined over a common set of vertices, subject to edgewise competition. Each network may grow new edges spontaneously or through triad closure, and each inhibits the other's growth and encourages the other's demise. These nonlinear stochastic competition equations yield to a mean-field analysis resulting in a nonlinear deterministic system. There may be multiple equilibria, and bifurcations of different types are shown to occur within a reduced parameter space. This situation models competitive peer-to-peer communication networks, such as BlackBerry Messenger displacing SMS, or instant messaging displacing emails.
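A minimal sketch of what such a mean-field reduction looks like: two edge densities evolving under spontaneous growth, triad closure, and mutual inhibition, integrated with Euler steps. The rate constants are illustrative placeholders, not values from the paper; with these rates the system settles into symmetric coexistence, and strengthening the inhibition is the kind of parameter change that produces the bifurcations the paper analyses.

```python
def step(x, y, dt, grow=0.5, triad=1.0, inhibit=2.0):
    """One Euler step for two competing edge densities x, y in [0, 1].

    Each network gains edges spontaneously and by triad closure, and
    loses edges in proportion to the other's density (inhibition).
    """
    dx = grow * (1.0 - x) + triad * x * (1.0 - x) - inhibit * x * y
    dy = grow * (1.0 - y) + triad * y * (1.0 - y) - inhibit * y * x
    return x + dt * dx, y + dt * dy

x, y = 0.6, 0.1  # asymmetric starting densities
for _ in range(5000):
    x, y = step(x, y, 0.01)
```

With these rates the unique interior fixed point is x = y = 0.5 and both eigenvalues of the Jacobian there are negative, so the asymmetric start is forgotten and the densities converge to coexistence.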

Relevance:

10.00%

Publisher:

Abstract:

In terrestrial television transmission, multiple paths of various lengths can occur between the transmitter and the receiver. Such paths occur because of reflections from objects outside the direct transmission path. The multipath signals arriving at the receiver are all detected along with the intended signal, causing time-displaced replicas called 'ghosts' to appear on the television picture. With an increasing number of people living within built-up areas, ghosting is becoming commonplace and deghosting is therefore becoming increasingly important. This thesis uses a deterministic time-domain approach to deghosting, resulting in a simple solution to the problem of removing ghosts. A new video detector is presented which reduces the synchronous detector local oscillator phase error, caused by any practical size of ghost, to a lower level than has previously been achieved. With the new detector, dispersion of the video signal is minimised and a known closed-form time-domain description of the individual ghost components within the detected video is subsequently obtained. Developed from mathematical descriptions of the detected video, a new deghoster filter structure is presented which is capable of removing both the in-phase (I) and the phase-quadrature (Q) ghost signals arising from VSB operation. The new deghoster filter requires much less hardware than any previous deghoster capable of removing both I and Q ghost components. A new channel identification algorithm, based upon simple correlation techniques, was also developed to find the delay and complex amplitude characteristics of individual ghosts. The result of the channel identification is then passed to the new I and Q deghoster filter for ghost cancellation. Five papers have been published from the research work performed for this thesis.
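The time-domain idea can be illustrated for the simplest case of a single attenuated, delayed replica: once a channel-identification step has supplied the ghost's delay and gain, a recursive filter that feeds back the cleaned signal cancels it exactly. This is a generic sketch of time-domain deghosting, not the specific detector or I/Q filter structure developed in the thesis.

```python
def add_ghost(signal, gain, delay):
    """Simulate a single multipath ghost: y[n] = x[n] + gain * x[n - delay]."""
    return [s + (gain * signal[i - delay] if i >= delay else 0.0)
            for i, s in enumerate(signal)]

def deghost(received, gain, delay):
    """Recursive canceller: x_hat[n] = y[n] - gain * x_hat[n - delay].

    The gain and delay would come from a channel-identification step,
    e.g. correlation against a known reference, as in the thesis.
    """
    out = []
    for i, y in enumerate(received):
        ghost = gain * out[i - delay] if i >= delay else 0.0
        out.append(y - ghost)
    return out

clean = [0.0, 1.0, 0.5, -0.3, 0.8, 0.0, 0.2, -0.1]
ghosted = add_ghost(clean, 0.4, 3)
recovered = deghost(ghosted, 0.4, 3)
```

Because the canceller reuses its own (already clean) output, a single feedback tap removes an arbitrarily long train of ghost echoes that a direct FIR subtraction would only attenuate.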

Relevance:

10.00%

Publisher:

Abstract:

The prediction of Northern Hemisphere (NH) extratropical cyclones by nine different ensemble prediction systems (EPSs), archived as part of The Observing System Research and Predictability Experiment (THORPEX) Interactive Grand Global Ensemble (TIGGE), has recently been explored using a cyclone tracking approach. This paper provides a continuation of this work, extending the analysis to the Southern Hemisphere (SH). While the EPSs have larger errors in all cyclone properties in the SH, the relative performance of the different EPSs remains broadly consistent between the two hemispheres. Some interesting differences are also shown. The Chinese Meteorological Administration (CMA) EPS has a significantly lower level of performance in the SH compared to the NH. Previous NH results showed that the Centro de Previsao de Tempo e Estudos Climaticos (CPTEC) EPS underpredicts cyclone intensity. The results of this current study show that this bias is significantly larger in the SH. The CPTEC EPS also has very little spread in both hemispheres. As with the NH results, cyclone propagation speed is underpredicted by all the EPSs in the SH. To investigate this further, the bias was also computed for the ECMWF high-resolution deterministic forecast. This bias was significantly smaller than that of the lower-resolution ECMWF EPS.

Relevance:

10.00%

Publisher:

Abstract:

Using the formalism of the Ruelle response theory, we study how the invariant measure of an Axiom A dynamical system changes as a result of adding noise, and describe how the stochastic perturbation can be used to explore the properties of the underlying deterministic dynamics. We first find the expression for the change in the expectation value of a general observable when a white noise forcing is introduced in the system, both in the additive and in the multiplicative case. We also show that the difference between the expectation value of the power spectrum of an observable in the stochastically perturbed case and of the same observable in the unperturbed case is equal to the variance of the noise times the square of the modulus of the linear susceptibility describing the frequency-dependent response of the system to perturbations with the same spatial patterns as the considered stochastic forcing. This provides a conceptual bridge between the change in the fluctuation properties of the system due to the presence of noise and the response of the unperturbed system to deterministic forcings. Using Kramers-Kronig theory, it is then possible to derive the real and imaginary part of the susceptibility and thus deduce the Green function of the system for any desired observable. We then extend our results to rather general patterns of random forcing, from the case of several white noise forcings, to noise terms with memory, up to the case of a space-time random field. Explicit formulas are provided for each relevant case analysed. As a general result, we find, using an argument of positive-definiteness, that the power spectrum of the stochastically perturbed system is larger at all frequencies than the power spectrum of the unperturbed system. We provide an example of application of our results by considering the spatially extended chaotic Lorenz 96 model. 
These results clarify the property of stochastic stability of SRB measures in Axiom A flows, provide tools for analysing stochastic parameterisations and related closure ansätze to be implemented in modelling studies, and introduce new ways to study the response of a system to external perturbations. Taking into account the chaotic hypothesis, we expect that our results have practical relevance for a more general class of systems than those belonging to Axiom A.

Relevance:

10.00%

Publisher:

Abstract:

We consider two weakly coupled systems and adopt a perturbative approach based on the Ruelle response theory to study their interaction. We propose a systematic way of parameterizing the effect of the coupling as a function of only the variables of a system of interest. Our focus is on describing the impacts of the coupling on the long term statistics rather than on the finite-time behavior. By direct calculation, we find that, at first order, the coupling can be surrogated by adding a deterministic perturbation to the autonomous dynamics of the system of interest. At second order, there are additionally two separate and very different contributions. One is a term taking into account the second-order contributions of the fluctuations in the coupling, which can be parameterized as a stochastic forcing with given spectral properties. The other one is a memory term, coupling the system of interest to its previous history, through the correlations of the second system. If these correlations are known, this effect can be implemented as a perturbation with memory on the single system. In order to treat this case, we present an extension to Ruelle's response theory able to deal with integral operators. We discuss our results in the context of other methods previously proposed for disentangling the dynamics of two coupled systems. We emphasize that our results do not rely on assuming a time scale separation, and, if such a separation exists, can be used equally well to study the statistics of the slow variables and those of the fast variables. By recursively applying the technique proposed here, we can treat the general case of multi-level systems.

Relevance:

10.00%

Publisher:

Abstract:

The paper discusses ensemble behaviour in the Spiking Neuron Stochastic Diffusion Network (SNSDN), a novel network exploring biologically plausible information processing based on higher-order temporal coding. SNSDN was proposed as an alternative solution to the binding problem [1]. SNSDN operation resembles Stochastic Diffusion Search (SDS), a non-deterministic search algorithm able to rapidly locate the best instantiation of a target pattern within a noisy search space ([3], [5]). In SNSDN, relevant information is encoded in the length of interspike intervals. Although every neuron operates in its own time, 'attention' to a pattern in the search space results in self-synchronised activity of a large population of neurons. When multiple patterns are present in the search space, 'switching of attention' results in a change of the synchronous activity. The qualitative effect of attention on the synchronicity of spiking behaviour in both the time and frequency domains is discussed.
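The SDS procedure that the network resembles is easy to state in code: each agent holds a position hypothesis, tests one randomly chosen component of the target there, and, if unsuccessful, copies the hypothesis of a randomly chosen agent when that agent is active, or re-seeds at random otherwise. This is a standard string-search SDS sketch, given as orientation for readers unfamiliar with [3], [5]; it is not the spiking-network implementation itself.

```python
import random

def sds(search_space, target, n_agents=100, iterations=200, seed=1):
    """Minimal Stochastic Diffusion Search: agents converge on the
    position in `search_space` that best matches `target`."""
    rng = random.Random(seed)
    n_pos = len(search_space) - len(target) + 1
    hyps = [rng.randrange(n_pos) for _ in range(n_agents)]
    active = [False] * n_agents
    for _ in range(iterations):
        # Test phase: each agent checks one random character of the target.
        for a in range(n_agents):
            j = rng.randrange(len(target))
            active[a] = search_space[hyps[a] + j] == target[j]
        # Diffusion phase: inactive agents recruit from a random agent.
        for a in range(n_agents):
            if not active[a]:
                other = rng.randrange(n_agents)
                hyps[a] = hyps[other] if active[other] else rng.randrange(n_pos)
    return max(set(hyps), key=hyps.count)  # largest cluster of hypotheses

space = "qwertyuiopasdfhelloghjklzxcvbnm"
best = sds(space, "hello")
```

Agents sitting at the perfect match pass every partial test and so recruit permanently, which is why a stable cluster (the analogue of the self-synchronised population in SNSDN) forms at the best instantiation.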

Relevance:

10.00%

Publisher:

Abstract:

A direct method is presented for determining the uncertainty in reservoir pressure, flow, and net present value (NPV) using the time-dependent, one-phase, two- or three-dimensional equations of flow through a porous medium. The uncertainty in the solution is modelled as a probability distribution function and is computed from given statistical data for input parameters such as permeability. The method generates an expansion for the mean of the pressure about a deterministic solution to the system equations using a perturbation to the mean of the input parameters. Hierarchical equations that define approximations to the mean solution at each point and to the field covariance of the pressure are developed and solved numerically. The procedure is then used to find the statistics of the flow and the risked value of the field, defined by the NPV, for a given development scenario. This method involves only one (albeit complicated) solution of the equations and contrasts with the more usual Monte Carlo approach, where many such solutions are required. The procedure is applied easily to other physical systems modelled by linear or nonlinear partial differential equations with uncertain data.
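The contrast with Monte Carlo is already visible on a scalar caricature of Darcy flow, p = q/k with uncertain permeability k: a second-order expansion of the mean about the mean permeability gives E[p] ≈ (q/μ)(1 + σ²/μ²) in a single evaluation, where Monte Carlo needs many samples, one "solve" each. The paper's setting is a PDE and the expansion is hierarchical, but the bookkeeping is the same; all numbers below are illustrative.

```python
import random

q, mu, sigma = 1.0, 2.0, 0.2  # flow rate; mean and std of permeability

# One-shot perturbation estimate: expand q/k about k = mu to second order,
#   q/k ~ q/mu - (q/mu**2)(k - mu) + (q/mu**3)(k - mu)**2,
# and take expectations: E[q/k] ~ q/mu + q*sigma**2/mu**3.
perturbation_mean = q / mu + q * sigma**2 / mu**3

# Monte Carlo reference: one "solve" per sampled permeability value.
rng = random.Random(42)
n = 200000
mc_mean = sum(q / rng.gauss(mu, sigma) for _ in range(n)) / n
```

For the PDE case each Monte Carlo sample is a full reservoir simulation, which is what makes the single (if more complicated) perturbation solve attractive.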

Relevance:

10.00%

Publisher:

Abstract:

The problem of calculating the probability of error in a DS/SSMA system has been extensively studied for more than two decades. When random sequences are employed, some conditioning must be done before the application of the central limit theorem is attempted, leading to a Gaussian distribution. The authors seek to characterise the multiple-access interference as a random walk with a random number of steps, for random and deterministic sequences. Using results from random-walk theory, they model the interference as a K-distributed random variable and use it to calculate the probability of error in the form of a series, for a DS/SSMA system with a coherent correlation receiver and BPSK modulation under Gaussian noise. The asymptotic properties of the proposed distribution agree with other analyses. This is, to the best of the authors' knowledge, the first attempt to propose a non-Gaussian distribution for the interference. The modelling can be extended to consider multipath fading and general modulation.
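The departure from Gaussianity that motivates a K-distributed model can be demonstrated with a toy version of the mechanism: a walk whose step count is itself random has positive excess kurtosis, i.e. heavier tails than the Gaussian a central-limit argument would give. Poisson step counts and ±1 steps here are illustrative stand-ins for the chip-level statistics in the paper.

```python
import math
import random

def poisson(lam, rng):
    """Poisson sample by inversion (the stdlib has no Poisson generator)."""
    n, u, term = 0, rng.random(), math.exp(-lam)
    cumulative = term
    while u > cumulative:
        n += 1
        term *= lam / n
        cumulative += term
    return n

rng = random.Random(7)
samples = []
for _ in range(100000):
    steps = poisson(2.0, rng)  # random number of interfering contributions
    samples.append(sum(rng.choice((-1, 1)) for _ in range(steps)))

n = len(samples)
mean = sum(samples) / n
var = sum((s - mean) ** 2 for s in samples) / n
kurt_excess = sum((s - mean) ** 4 for s in samples) / n / var**2 - 3.0
```

For this compound-Poisson walk the theoretical excess kurtosis is 1/λ = 0.5, so the tails are visibly heavier than Gaussian (excess kurtosis 0), which is the qualitative feature the K-distribution captures.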

Relevance:

10.00%

Publisher:

Abstract:

Decadal predictions have a high profile in the climate science community and beyond, yet very little is known about their skill. Nor is there any agreed protocol for estimating their skill. This paper proposes a sound and coordinated framework for verification of decadal hindcast experiments. The framework is illustrated for decadal hindcasts tailored to meet the requirements and specifications of CMIP5 (Coupled Model Intercomparison Project phase 5). The chosen metrics address key questions about the information content in initialized decadal hindcasts. These questions are: (1) Do the initial conditions in the hindcasts lead to more accurate predictions of the climate, compared to uninitialized climate change projections? and (2) Is the prediction model's ensemble spread an appropriate representation of forecast uncertainty on average? The first question is addressed through deterministic metrics that compare the initialized and uninitialized hindcasts. The second question is addressed through a probabilistic metric applied to the initialized hindcasts and comparing different ways to ascribe forecast uncertainty. Verification is advocated at smoothed regional scales that can illuminate broad areas of predictability, as well as at the grid scale, since many users of the decadal prediction experiments who feed the climate data into applications or decision models will use the data at grid scale, or downscale it to even higher resolution. An overall statement on skill of CMIP5 decadal hindcasts is not the aim of this paper. The results presented are only illustrative of the framework, which would enable such studies.
However, broad conclusions that are beginning to emerge from the CMIP5 results include (1) most predictability at the interannual-to-decadal scale, relative to climatological averages, comes from external forcing, particularly for temperature; (2) though moderate, additional skill is added by the initial conditions over what is imparted by external forcing alone; however, the impact of initialization may result in overall worse predictions in some regions than provided by uninitialized climate change projections; (3) limited hindcast records and the dearth of climate-quality observational data impede our ability to quantify expected skill as well as model biases; and (4) as is common to seasonal-to-interannual model predictions, the spread of the ensemble members is not necessarily a good representation of forecast uncertainty. The authors recommend that this framework be adopted to serve as a starting point to compare prediction quality across prediction systems. The framework can provide a baseline against which future improvements can be quantified. The framework also provides guidance on the use of these model predictions, which differ in fundamental ways from the climate change projections that much of the community has become familiar with, including adjustment of mean and conditional biases, and consideration of how best to approach forecast uncertainty.
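Question (2) is often checked through the spread-skill relation: for a statistically consistent ensemble, the mean squared error of the ensemble mean equals (n+1)/n times the mean ensemble variance, where n is the ensemble size. A synthetic check under idealised assumptions (Gaussian signal and noise; sizes and seed illustrative):

```python
import random

rng = random.Random(3)
n_forecasts, n_members, sigma = 8000, 10, 1.0

mse, spread2 = 0.0, 0.0
for _ in range(n_forecasts):
    centre = rng.gauss(0.0, 1.0)            # predictable part of the forecast
    truth = centre + rng.gauss(0.0, sigma)  # truth = signal + unpredictable noise
    # A reliable ensemble: members drawn from the same distribution as truth.
    members = [centre + rng.gauss(0.0, sigma) for _ in range(n_members)]
    ens_mean = sum(members) / n_members
    mse += (ens_mean - truth) ** 2
    spread2 += sum((m - ens_mean) ** 2 for m in members) / (n_members - 1)
mse /= n_forecasts
spread2 /= n_forecasts

# For a reliable ensemble: MSE(ensemble mean) ~ ((n + 1) / n) * mean spread^2.
ratio = mse / spread2
```

A ratio well above (n+1)/n indicates an underdispersive ensemble, the failure mode noted in conclusion (4) above.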

Relevance:

10.00%

Publisher:

Abstract:

In this paper we consider one-dimensional diffusions with constant coefficients in a finite interval with jump boundary and a certain deterministic jump distribution. We use coupling methods in order to identify the spectral gap in the case of a large drift and prove that there is a threshold drift above which the bottom of the spectrum no longer depends on the drift. As a corollary to our result we are able to answer two questions concerning elliptic eigenvalue problems with non-local boundary conditions formulated previously by Iddo Ben-Ari and Ross Pinsky.

Relevance:

10.00%

Publisher:

Abstract:

Logistic models are studied as a tool to convert dynamical forecast information (deterministic and ensemble) into probability forecasts. A logistic model is obtained by setting the logarithmic odds ratio equal to a linear combination of the inputs. As with any statistical model, logistic models will suffer from overfitting if the number of inputs is comparable to the number of forecast instances. Computational approaches to avoid overfitting by regularization are discussed, and efficient techniques for model assessment and selection are presented. A logit version of the lasso (originally a linear regression technique) is discussed. In lasso models, less important inputs are identified and the corresponding coefficients are set to zero, providing an efficient and automatic model-reduction procedure. For the same reason, lasso models are particularly appealing for diagnostic purposes.
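The mechanics described above (a logistic link with an L1 penalty that drives unimportant coefficients exactly to zero) can be sketched with proximal gradient descent: a gradient step on the negative log-likelihood followed by soft-thresholding of the weights. This is a pure-Python sketch on tiny synthetic data; the penalty weight and learning rate are illustrative choices, not the paper's fitting procedure.

```python
import math

def fit_logit_lasso(X, y, lam=0.1, lr=0.1, iters=2000):
    """Logistic model with an L1 penalty, fitted by proximal gradient:
    gradient step on the log-likelihood, then soft-thresholding.
    The intercept is left unpenalised."""
    n, d = len(X), len(X[0])
    w, b = [0.0] * d, 0.0
    for _ in range(iters):
        gw, gb = [0.0] * d, 0.0
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            err = 1.0 / (1.0 + math.exp(-z)) - yi  # predicted prob minus label
            gb += err / n
            for j in range(d):
                gw[j] += err * xi[j] / n
        b -= lr * gb
        for j in range(d):
            wj = w[j] - lr * gw[j]
            # Soft-threshold: small coefficients are set exactly to zero.
            w[j] = math.copysign(max(abs(wj) - lr * lam, 0.0), wj)
    return w, b

# Feature 0 determines the label; feature 1 is noise the lasso should drop.
X = [[1.0, 0.3], [0.9, -0.2], [1.1, 0.1], [-1.0, 0.25], [-0.8, -0.3], [-1.2, 0.05]]
y = [1, 1, 1, 0, 0, 0]
w, b = fit_logit_lasso(X, y)
```

The soft-thresholding step is what makes the reduction automatic: the noise coefficient lands exactly at zero rather than merely close to it, which is the diagnostic appeal the abstract mentions.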