865 results for Time equivalent approach


Relevance: 30.00%

Abstract:

A novel approach is presented for combining spatial and temporal detail from newly available TRMM-based data sets to derive hourly rainfall intensities at 1-km spatial resolution for hydrological modelling applications. Time series of rainfall intensities derived from 3-hourly 0.25° TRMM 3B42 data are merged with a 1-km gridded rainfall climatology based on TRMM 2B31 data to account for the sub-grid spatial distribution of rainfall intensities within coarse-scale 0.25° grid cells. The method is implemented for two dryland catchments in Tunisia and Senegal, and validated against gauge data. The outcomes of the validation show that the spatially disaggregated and intensity corrected TRMM time series more closely approximate ground-based measurements than non-corrected data. The method introduced here enables the generation of rainfall intensity time series with realistic temporal and spatial detail for dynamic modelling of runoff and infiltration processes that are especially important to water resource management in arid regions.
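As a toy illustration of the merging idea described above, the sketch below redistributes one coarse-grid intensity over the 1-km sub-grid cells it covers, in proportion to a climatology field; all names and values are invented for illustration and are not taken from the paper.

```python
import numpy as np

# Hypothetical sketch: redistribute a coarse 0.25-degree TRMM 3B42 intensity
# over 1-km cells in proportion to a TRMM 2B31-based climatology, while
# preserving the coarse-cell mean intensity.
def disaggregate(coarse_intensity, climatology):
    weights = climatology / climatology.mean()  # unit-mean weights
    return coarse_intensity * weights

fine = disaggregate(4.0, np.array([[1.0, 2.0], [3.0, 2.0]]))
# the mean of `fine` equals the coarse intensity by construction
```

Normalising the weights to unit mean guarantees that the disaggregated field conserves the rainfall mass of the coarse cell.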

Relevance: 30.00%

Abstract:

The family of theories dubbed ‘luck egalitarianism’ represents an attempt to infuse egalitarian thinking with a concern for personal responsibility, arguing that inequalities are just when, or to the extent that, they result from choice, but unjust when, or to the extent that, they result from luck. In this essay I argue that luck egalitarians should sometimes seek to limit inequalities even when they have a fully choice-based pedigree (i.e., result only from the choices of agents). I grant that the broad approach is correct but argue that the temporal standpoint from which we judge whether, or to what extent, a person can be held responsible should be radically altered. Instead of asking, as Standard (or Static) Luck Egalitarianism seems to, whether, or to what extent, a person was responsible for the choice at the time of choosing, and asking the question of responsibility only once, we should ask whether, or to what extent, they are responsible for the choice at the point at which we are seeking to discover whether, or to what extent, the inequality is just; the question of responsibility is thus not settled but constantly under review. Such an approach will differ from Standard Luck Egalitarianism only if responsibility for a choice is not set in stone: if responsibility can weaken, then we should not see the boundary between luck and responsibility within a particular action as static. Drawing on Derek Parfit’s illuminating discussions of personal identity, and on the contemporary literature on moral responsibility, I suggest there are good reasons to think that responsibility can weaken: we are not necessarily fully responsible for a choice forever, even if we were fully responsible at the time of choosing. I call the variant of luck egalitarianism that recognises this shift in temporal standpoint, and that responsibility can weaken, Dynamic Luck Egalitarianism (DLE).
In conclusion I offer a preliminary discussion of what kind of policies DLE would support.

Relevance: 30.00%

Abstract:

In estimating the inputs into the Modern Portfolio Theory (MPT) portfolio optimisation problem, it is usual to use equal weighted historic data. Equal weighting of the data, however, does not take account of the current state of the market. Consequently this approach is unlikely to perform well in any subsequent period as the data is still reflecting market conditions that are no longer valid. The need for some return-weighting scheme that gives greater weight to the most recent data would seem desirable. Therefore, this study uses returns data which are weighted to give greater weight to the most recent observations to see if such a weighting scheme can offer improved ex-ante performance over that based on un-weighted data.
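One common way to implement such a return-weighting scheme, sketched here under the assumption of exponential decay weights (the decay factor `lam` and the sample returns are illustrative, not the study's choices), is:

```python
import numpy as np

# Illustrative sketch (not the study's exact scheme): exponentially weighted
# estimates of the mean return vector and covariance matrix, so that more
# recent observations carry more weight; the decay factor `lam` is assumed.
def ew_moments(returns, lam=0.94):
    """returns: (T, N) array of asset returns, oldest row first."""
    T = returns.shape[0]
    w = lam ** np.arange(T - 1, -1, -1)   # most recent row gets weight 1
    w = w / w.sum()                       # normalise weights to sum to one
    mu = w @ returns                      # weighted mean returns
    centred = returns - mu
    cov = (centred * w[:, None]).T @ centred  # weighted covariance
    return mu, cov

rets = np.array([[0.010, 0.020], [0.030, -0.010], [0.000, 0.010]])
mu, cov = ew_moments(rets)
```

Setting `lam = 1` recovers the usual equal-weighted estimates, so the scheme nests the conventional un-weighted approach as a special case.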

Relevance: 30.00%

Abstract:

Using the formalism of the Ruelle response theory, we study how the invariant measure of an Axiom A dynamical system changes as a result of adding noise, and describe how the stochastic perturbation can be used to explore the properties of the underlying deterministic dynamics. We first find the expression for the change in the expectation value of a general observable when a white noise forcing is introduced in the system, both in the additive and in the multiplicative case. We also show that the difference between the expectation value of the power spectrum of an observable in the stochastically perturbed case and of the same observable in the unperturbed case is equal to the variance of the noise times the square of the modulus of the linear susceptibility describing the frequency-dependent response of the system to perturbations with the same spatial patterns as the considered stochastic forcing. This provides a conceptual bridge between the change in the fluctuation properties of the system due to the presence of noise and the response of the unperturbed system to deterministic forcings. Using Kramers-Kronig theory, it is then possible to derive the real and imaginary part of the susceptibility and thus deduce the Green function of the system for any desired observable. We then extend our results to rather general patterns of random forcing, from the case of several white noise forcings, to noise terms with memory, up to the case of a space-time random field. Explicit formulas are provided for each relevant case analysed. As a general result, we find, using an argument of positive-definiteness, that the power spectrum of the stochastically perturbed system is larger at all frequencies than the power spectrum of the unperturbed system. We provide an example of application of our results by considering the spatially extended chaotic Lorenz 96 model. 
These results clarify the property of stochastic stability of SRB measures in Axiom A flows, provide tools for analysing stochastic parameterisations and related closure ansätze to be implemented in modelling studies, and introduce new ways to study the response of a system to external perturbations. Taking into account the chaotic hypothesis, we expect our results to have practical relevance for a more general class of systems than those belonging to Axiom A.
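The central spectral result stated above can be written schematically as follows; the notation (observable Ψ, noise strength ε, susceptibility χ_Ψ) is assumed for illustration rather than quoted from the paper:

```latex
% Change in the power spectrum of an observable \Psi under white-noise
% forcing of strength \varepsilon: noise variance times the squared
% modulus of the linear susceptibility.
S^{(\varepsilon)}_{\Psi}(\omega) - S^{(0)}_{\Psi}(\omega)
  = \varepsilon^{2}\,\bigl|\chi_{\Psi}(\omega)\bigr|^{2} \;\ge\; 0 .
```

The non-negativity of the right-hand side is what forces the power spectrum of the stochastically perturbed system to be larger than the unperturbed spectrum at all frequencies.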

Relevance: 30.00%

Abstract:

User interaction within a virtual environment may take various forms: a teleconferencing application will require users to speak to each other (Geak, 1993); with computer-supported co-operative working, an engineer may wish to pass an object to another user for examination; in a battlefield simulation (McDonough, 1992), users might exchange fire. In all cases it is necessary for the actions of one user to be presented to the others sufficiently quickly to allow realistic interaction. In this paper we take a fresh look at the approach of virtual reality operating systems by tackling the underlying issues of creating real-time multi-user environments.

Relevance: 30.00%

Abstract:

A characterization of observability for linear time-varying descriptor systems E(t)x′(t) + F(t)x(t) = B(t)u(t), y(t) = C(t)x(t) was recently developed. Neither E nor C were required to have constant rank. This paper defines a dual system, and a type of controllability, so that observability of the original system is equivalent to controllability of the dual system. Criteria for observability and controllability are given in terms of arrays of derivatives of the original coefficients. In addition, the duality results of this paper lead to an improvement on a previous fundamental structure result for solvable systems of the form E(t)x′(t) + F(t)x(t) = f(t).

Relevance: 30.00%

Abstract:

Johne's disease in cattle is a contagious wasting disease caused by Mycobacterium avium subspecies paratuberculosis (MAP). Johne's infection is characterised by a long subclinical phase and can therefore go undetected for long periods of time, during which substantial production losses can occur. The protracted nature of Johne's infection therefore presents a challenge for both veterinarians and farmers when discussing control options, due to a paucity of information and limited test performance when screening for the disease. The objective was to model Johne's control decisions in suckler beef cattle using a decision support approach, implying equal focus on ‘end user’ (veterinarian) participation whilst still focusing on the technical disease modelling aspects during the decision support model development. The model shows how Johne's disease is likely to affect a herd over time, in terms of both physical and financial impacts. In addition, the model simulates the effect on production of two different Johne's control strategies: herd management measures, and test-and-cull measures. The article also provides and discusses results from a sensitivity analysis to assess the effects on production of improving the currently available test performance. Output from running the model shows that a combination of management improvements to reduce routes of infection, and testing and culling to remove infected and infectious animals, is likely to be the least-cost control strategy.

Relevance: 30.00%

Abstract:

We study linear variable coefficient control problems in descriptor form. Based on a behaviour approach and the general theory for linear differential algebraic systems we give the theoretical analysis and describe numerically stable methods to determine the structural properties of the system.

Relevance: 30.00%

Abstract:

Wireless local area networks (WLANs) based on the IEEE 802.11 standard are now widespread. Most are used to provide access for mobile devices to a conventional wired infrastructure, and some are used where wires are not possible, forming an ad hoc network of their own. There are several varieties at the physical or radio layer (802.11, 802.11a, 802.11b, 802.11g), with each featuring different data rates, modulation schemes and transmission frequencies. However, all of them share a common medium access control (MAC) layer. As this is largely based on a contention approach, it does not allow prioritising of traffic or stations, so it cannot easily provide the quality of service (QoS) required by time-sensitive applications, such as voice or video transmission. In order to address this shortfall of the technology, the IEEE set up a task group that is aiming to enhance the MAC layer protocol so that it can provide QoS. The latest draft at the time of writing is Draft 11, dated October 2004. The article describes the yet-to-be-ratified 802.11e standard and is based on that draft.

Relevance: 30.00%

Abstract:

This paper presents the development of an export coefficient model to characterise the rates and sources of P export from land to water in four reservoir systems located in a semi-arid rural region of southern Portugal. The model was developed to enable effective management of these important water resource systems under the EU Water Framework Directive. This is the first time such an approach has been fully adapted for the semi-arid systems typical of Mediterranean Europe. The sources of P loading delivered to each reservoir from its catchment were determined, and scenario analysis was undertaken to predict the likely impact of catchment management strategies on the scale and rate of P loading delivered to each water body from its catchment. The results indicate the importance of farming and of discharges from sewage treatment works and collective septic tanks as the main contributors to the total diffuse and point-source P loading delivered to the reservoirs, respectively. A reduction in the total P loading for all study areas would require control of farming practices and more efficient removal of P from human wastes prior to discharge to surface waters. The scenario analysis indicates that a strategy based solely on reducing the agricultural P surplus may result in only a slow improvement in water quality, which would be unlikely to support the generation of good ecological status in reservoirs. The model application indicates that a reduction of P inputs to the reservoirs should first focus on reducing P loading from sewage effluent discharges through the introduction of tertiary treatment (P-stripping) in all major residential areas. The fully calibrated export coefficient modelling approach transferred well to semi-arid regions, the only significant limitation being the availability of suitable input data to drive the model.
Further studies using this approach in semi-arid catchments are now needed to increase the knowledge of nutrient export behaviours in semi-arid regions.
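The core of an export coefficient model is a simple accounting identity, sketched below; the land uses, coefficient values and sewage load are invented for illustration and are not the paper's calibrated inputs.

```python
# Minimal sketch of an export coefficient calculation: the total P load
# delivered to a water body is the sum of (export coefficient x source
# extent) over diffuse sources, plus point-source loads.
land_use_ha = {"arable": 1200.0, "pasture": 800.0}   # source areas (ha)
coeff_kg_per_ha = {"arable": 0.65, "pasture": 0.30}  # kg P / ha / yr (assumed)
sewage_kg = 950.0  # point-source load from treatment works / septic tanks

diffuse_kg = sum(coeff_kg_per_ha[u] * a for u, a in land_use_ha.items())
total_p_load_kg = diffuse_kg + sewage_kg
```

Scenario analysis then amounts to re-running the sum with altered coefficients (e.g. reduced agricultural surplus) or a reduced point-source term (e.g. P-stripping).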

Relevance: 30.00%

Abstract:

Flood extents caused by fluvial floods in urban and rural areas may be predicted by hydraulic models. Assimilation may be used to correct the model state and improve the estimates of the model parameters or external forcing. One common observation assimilated is the water level at various points along the modelled reach. Distributed water levels may be estimated indirectly along the flood extents in Synthetic Aperture Radar (SAR) images by intersecting the extents with the floodplain topography. It is necessary to select a subset of levels for assimilation because adjacent levels along the flood extent will be strongly correlated. A method for selecting such a subset automatically and in near real-time is described, which would allow the SAR water levels to be used in a forecasting model. The method first selects candidate waterline points in flooded rural areas having low slope. The waterline levels and positions are corrected for the effects of double reflections between the water surface and emergent vegetation at the flood edge. Waterline points are also selected in flooded urban areas away from radar shadow and layover caused by buildings, with levels similar to those in adjacent rural areas. The resulting points are thinned to reduce spatial autocorrelation using a top-down clustering approach. The method was developed using a TerraSAR-X image from a particular case study involving urban and rural flooding. The waterline points extracted proved to be spatially uncorrelated, with levels reasonably similar to those determined manually from aerial photographs, and in good agreement with those of nearby gauges.
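The thinning step can be caricatured with a greedy minimum-separation rule, a simpler stand-in for the top-down clustering used in the paper; the coordinates and separation threshold below are invented for illustration.

```python
import numpy as np

# Greedy stand-in for the paper's top-down clustering: keep waterline
# candidate points only if they lie at least `min_sep` from every point
# already kept, reducing spatial autocorrelation between selected levels.
def thin_points(points, min_sep):
    kept = []
    for p in points:
        if all(np.hypot(p[0] - q[0], p[1] - q[1]) >= min_sep for q in kept):
            kept.append(p)
    return np.array(kept)

pts = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 0.0], [5.1, 0.2], [10.0, 1.0]])
thinned = thin_points(pts, min_sep=1.0)  # keeps 3 well-separated points
```

In an assimilation setting the separation threshold would be chosen so that the retained water levels are approximately uncorrelated, which is the property the paper's clustering approach is designed to achieve.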

Relevance: 30.00%

Abstract:

Variational data assimilation in continuous time is revisited. The central techniques applied in this paper are in part adopted from the theory of optimal nonlinear control. Alternatively, the investigated approach can be considered a continuous-time generalization of what is known as weakly constrained four-dimensional variational assimilation (4D-Var) in the geosciences. The technique allows trajectories to be assimilated in the case of partial observations and in the presence of model error. Several mathematical aspects of the approach are studied. Computationally, it amounts to solving a two-point boundary value problem. For imperfect models, the trade-off between small dynamical error (i.e. the trajectory obeys the model dynamics) and small observational error (i.e. the trajectory closely follows the observations) is investigated. This trade-off turns out to be trivial if the model is perfect. However, even in this situation, allowing for minute deviations from the perfect model is shown to have positive effects, namely to regularize the problem. The presented formalism is dynamical in character. No statistical assumptions on dynamical or observational noise are imposed.
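A generic weak-constraint, continuous-time 4D-Var cost functional of the kind discussed above can be sketched as follows; the symbols h, f, R and Q are generic placeholders rather than the paper's notation:

```latex
% Trade-off between observational error (first term) and dynamical error
% (second term); the perfect-model limit Q \to 0 enforces \dot{x} = f(x).
J[x] = \int_{0}^{T} \Bigl(
    \tfrac{1}{2}\bigl\lVert y(t) - h\bigl(x(t)\bigr) \bigr\rVert^{2}_{R^{-1}}
  + \tfrac{1}{2}\bigl\lVert \dot{x}(t) - f\bigl(x(t)\bigr) \bigr\rVert^{2}_{Q^{-1}}
\Bigr)\, dt .
```

Minimising J over trajectories x(·) yields the two-point boundary value problem mentioned above, and keeping Q finite but small is the regularising deviation from the perfect model.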

Relevance: 30.00%

Abstract:

Data assimilation refers to the problem of finding trajectories of a prescribed dynamical model in such a way that the output of the model (usually some function of the model states) follows a given time series of observations. Typically though, these two requirements cannot both be met at the same time: tracking the observations is not possible without the trajectory deviating from the proposed model equations, while adherence to the model requires deviations from the observations. Thus, data assimilation faces a trade-off. In this contribution, the sensitivity of the data assimilation with respect to perturbations in the observations is identified as the parameter which controls the trade-off. A relation between the sensitivity and the out-of-sample error is established, which allows the latter to be calculated under operational conditions. A minimum out-of-sample error is proposed as a criterion to set an appropriate sensitivity and to settle the discussed trade-off. Two approaches to data assimilation are considered, namely variational data assimilation and Newtonian nudging, also known as synchronization. Numerical examples demonstrate the feasibility of the approach.
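Newtonian nudging, one of the two approaches compared above, can be sketched on a toy scalar model; the model, gain and observations below are invented for illustration and are not the paper's examples.

```python
import numpy as np

# Newtonian nudging (synchronization) on the toy model dx/dt = -x:
# the state is relaxed towards the observations with gain k, so the gain
# plays the role of the sensitivity that controls the trade-off.
def nudge(x0, obs, dt, k, f):
    x, traj = x0, [x0]
    for y in obs:
        x = x + dt * (f(x) + k * (y - x))  # model tendency + nudging term
        traj.append(x)
    return np.array(traj)

f = lambda x: -x                # toy linear model
obs = np.full(50, 2.0)          # constant synthetic observations
traj = nudge(0.0, obs, dt=0.1, k=5.0, f=f)
# the state settles between the model equilibrium (0) and the observations
# (2), at 2k/(1+k) for this linear example
```

A large gain tracks the observations closely at the cost of larger dynamical error; a small gain does the opposite, which is exactly the trade-off the sensitivity criterion above is meant to settle.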

Relevance: 30.00%

Abstract:

The task of this paper is to develop a Time-Domain Probe Method for the reconstruction of impenetrable scatterers. The basic idea of the method is to use pulses in the time domain and the time-dependent response of the scatterer to reconstruct its location and shape. The method is based on the basic causality principle of time-dependent scattering. The method is independent of the boundary condition and is applicable to limited-aperture scattering data. In particular, we discuss the reconstruction of the shape of a rough surface in three dimensions from time-domain measurements of the scattered field. In practice, measurement data is collected where the incident field is given by a pulse. We formulate the time-domain field reconstruction problem equivalently via frequency-domain integral equations or via a retarded boundary integral equation based on results of Bamberger, Ha-Duong and Lubich. In contrast to pure frequency-domain methods, here we use a time-domain characterization of the unknown shape for its reconstruction. Our paper describes the Time-Domain Probe Method and relates it to previous frequency-domain approaches on sampling and probe methods by Colton, Kirsch, Ikehata, Potthast, Luke, Sylvester et al. The approach significantly extends recent work of Chandler-Wilde and Lines (2005) and Luke and Potthast (2006) on the time-domain point source method. We provide a complete convergence analysis for the method in the rough-surface scattering case and provide numerical simulations and examples.

Relevance: 30.00%

Abstract:

The administration of antisense oligonucleotides (AOs) to skip one or more exons in mutated forms of the DMD gene and so restore the reading frame of the transcript is one of the most promising approaches to treat Duchenne muscular dystrophy (DMD). At present, preclinical studies demonstrating the efficacy and safety of long-term AO administration have not been conducted. Furthermore, it is essential to determine the minimal effective dose and frequency of administration. In this study, two different low doses (LDs) of phosphorodiamidate morpholino oligomer (PMO) designed to skip the mutated exon 23 in the mdx dystrophic mouse were administered for up to 12 months. Mice treated for 50 weeks showed a substantial dose-related amelioration of the pathology, particularly in the diaphragm. Moreover, the generalized physical activity was profoundly enhanced compared to untreated mdx mice showing that widespread, albeit partial, dystrophin expression restores the normal activity in mdx mice. Our results show for the first time that a chronic long-term administration of LDs of unmodified PMO, equivalent to doses in use in DMD boys, is safe, significantly ameliorates the muscular dystrophic phenotype and improves the activity of dystrophin-deficient mice, thus encouraging the further clinical translation of this approach in humans.