63 results for Initial value problems


Relevance:

40.00%

Publisher:

Abstract:

This paper is concerned with solving numerically the Dirichlet boundary value problem for Laplace's equation in a nonlocally perturbed half-plane. This problem arises in the simulation of classical unsteady water wave problems. The starting point for the numerical scheme is the boundary integral equation reformulation of this problem as an integral equation of the second kind on the real line in Preston et al. (2008, J. Int. Equ. Appl., 20, 121–152). We present a Nyström method for the numerical solution of this integral equation, show its stability and convergence, and present and analyse a numerical scheme for computing the Dirichlet-to-Neumann map, i.e., for deducing the instantaneous fluid surface velocity from the velocity potential on the surface, a key computational step in unsteady water wave simulations. In particular, we show that our numerical schemes are superalgebraically convergent if the fluid surface is infinitely smooth. The theoretical results are illustrated by numerical experiments.
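As a minimal illustration of the Nyström idea described above, the sketch below discretizes a generic integral equation of the second kind on a finite interval with the trapezoidal rule; the kernel, right-hand side and domain are placeholder assumptions, not the water-wave kernel on the real line analysed in the paper.

```python
# Minimal Nystrom-method sketch for a second-kind integral equation
#   u(x) - \int_a^b K(x, y) u(y) dy = f(x).
# K, f and the interval [a, b] are illustrative placeholders, not the
# water-wave kernel from the paper.
import numpy as np

def nystrom_solve(K, f, a, b, n):
    """Discretize with trapezoidal quadrature and solve the dense system."""
    x = np.linspace(a, b, n)
    w = np.full(n, (b - a) / (n - 1))   # trapezoidal weights
    w[0] *= 0.5
    w[-1] *= 0.5
    # (I - K W) u = f, where (K W)_ij = K(x_i, x_j) w_j
    A = np.eye(n) - K(x[:, None], x[None, :]) * w[None, :]
    return x, np.linalg.solve(A, f(x))

# Smooth kernel and data: the quadrature error, and hence the solution
# error, decays rapidly with n, loosely echoing the superalgebraic
# convergence the paper proves for infinitely smooth surfaces.
K = lambda x, y: 0.25 * np.exp(-(x - y) ** 2)
f = lambda x: np.sin(x)
x, u = nystrom_solve(K, f, 0.0, 2 * np.pi, 200)
```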

Relevance:

40.00%

Publisher:

Abstract:

The Boltzmann equation in the presence of boundary and initial conditions, which describes the general case of carrier transport in microelectronic devices, is analysed in terms of Monte Carlo theory. The classical Ensemble Monte Carlo algorithm, originally devised from merely phenomenological considerations of the initial and boundary carrier contributions, is now derived in a formal way. The approach suggests a set of event-biasing algorithms for statistical enhancement as an alternative to the population control technique, which is virtually the only algorithm currently used in particle simulators. The scheme of the self-consistent coupling of the Boltzmann and Poisson equations is considered for the case of weighted particles. It is shown that particles survive the successive iteration steps.
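To make the event-biasing idea concrete, here is a deliberately simplified sketch: the selection probability of a rare scattering mechanism is inflated and the particle weight carries the likelihood ratio, so the estimator remains unbiased while its variance drops. The rates and the estimated quantity are invented for illustration, not a physical carrier-transport model.

```python
# Toy event-biasing sketch in the spirit of weighted-particle Monte Carlo:
# inflate the selection probability of a rare mechanism and compensate
# with the particle weight, preserving the mean of the estimator.
import numpy as np

rng = np.random.default_rng(0)
rate_a, rate_b = 10.0, 0.1               # frequent vs rare scattering rates
p_b = rate_b / (rate_a + rate_b)         # analog probability of the rare event
n = 100_000                              # number of particles

def estimate_rare_fraction(sel_prob):
    """Estimate p_b while selecting the rare branch with probability sel_prob."""
    hits = rng.random(n) < sel_prob      # (possibly biased) mechanism selection
    weight = p_b / sel_prob              # likelihood ratio (1 in the analog case)
    return hits.sum() * weight / n

print("exact: ", p_b)
print("analog:", estimate_rare_fraction(p_b))   # few rare events, noisy
print("biased:", estimate_rare_fraction(0.5))   # variance-reduced, still unbiased
```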

Relevance:

30.00%

Publisher:

Abstract:

We describe a new methodology for comparing satellite radiation budget data with a numerical weather prediction (NWP) model. This is applied to data from the Geostationary Earth Radiation Budget (GERB) instrument on Meteosat-8. The methodology brings together, in near-real time, GERB broadband shortwave and longwave fluxes with simulations based on analyses produced by the Met Office global NWP model. Results for the period May 2003 to February 2005 illustrate the progressive improvements in the data products as various initial problems were resolved. In most areas the comparisons reveal systematic errors in the model's representation of surface properties and clouds, which are discussed elsewhere. However, for clear-sky regions over the oceans the model simulations are believed to be sufficiently accurate to allow the quality of the GERB fluxes themselves to be assessed and any changes in time of the performance of the instrument to be identified. Using model and radiosonde profiles of temperature and humidity as input to a single-column version of the model's radiation code, we conduct sensitivity experiments which provide estimates of the expected model errors over the ocean of about ±5–10 W m−2 in clear-sky outgoing longwave radiation (OLR) and ±0.01 in clear-sky albedo. For the more recent data the differences between the observed and modeled OLR and albedo are well within these error estimates. The close agreement between the observed and modeled values, particularly for the most recent period, illustrates the value of the methodology. It also contributes to the validation of the GERB products and increases confidence in the quality of the data, prior to their release.
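As a toy illustration of the flux-comparison step (with synthetic numbers, not GERB or Met Office data), one can compute the bias and RMS difference between collocated observed and modelled clear-sky OLR and check them against the quoted model-error envelope:

```python
# Sketch of the comparison step: bias and RMS difference between
# collocated satellite-like and model clear-sky OLR over ocean, checked
# against the paper's quoted expected model error.  The arrays here are
# synthetic placeholders, not real GERB or Met Office data.
import numpy as np

rng = np.random.default_rng(1)
olr_model = 280.0 + 15.0 * rng.standard_normal(5000)        # W m-2, synthetic
olr_obs = olr_model + 3.0 + 2.0 * rng.standard_normal(5000) # synthetic "GERB"

bias = np.mean(olr_obs - olr_model)
rmse = np.sqrt(np.mean((olr_obs - olr_model) ** 2))
print(f"bias = {bias:+.1f} W m-2, rmse = {rmse:.1f} W m-2")
# The paper's sensitivity experiments suggest ~5-10 W m-2 of expected model
# error, so a difference inside that range cannot be attributed to GERB.
print("within model-error envelope:", abs(bias) <= 10.0)
```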

Relevance:

30.00%

Publisher:

Abstract:

For the very large nonlinear dynamical systems that arise in a wide range of physical, biological and environmental problems, the data needed to initialize a numerical forecasting model are seldom available. To generate accurate estimates of the expected states of the system, both current and future, the technique of ‘data assimilation’ is used to combine the numerical model predictions with observations of the system measured over time. Assimilation of data is an inverse problem that for very large-scale systems is generally ill-posed. In four-dimensional variational assimilation schemes, the dynamical model equations provide constraints that act to spread information into data sparse regions, enabling the state of the system to be reconstructed accurately. The mechanism for this is not well understood. Singular value decomposition techniques are applied here to the observability matrix of the system in order to analyse the critical features in this process. Simplified models are used to demonstrate how information is propagated from observed regions into unobserved areas. The impact of the size of the observational noise and the temporal position of the observations is examined. The best signal-to-noise ratio needed to extract the most information from the observations is estimated using Tikhonov regularization theory. Copyright © 2005 John Wiley & Sons, Ltd.
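A hedged sketch of the analysis described above, on a toy cyclic-advection model rather than the paper's simplified models: build the observability matrix from the model and observation operators, inspect its singular values, and reconstruct the initial state by Tikhonov regularization.

```python
# Observability / Tikhonov sketch on a toy linear model.  The model,
# observation operator and noise level are illustrative placeholders.
import numpy as np

n, n_steps = 40, 20
rng = np.random.default_rng(2)

# Cyclic advection: each step shifts the state one grid point forward.
M = np.roll(np.eye(n), 1, axis=0)
H = np.zeros((5, n))
H[np.arange(5), np.arange(5)] = 1.0      # observe only 5 grid points

# Stack H, HM, HM^2, ... into the observability matrix: observing the
# same points at later times pulls in information from upstream regions.
G = np.vstack([H @ np.linalg.matrix_power(M, k) for k in range(n_steps)])
U, s, Vt = np.linalg.svd(G, full_matrices=False)
print("rank of observability matrix:", np.sum(s > 1e-10))

# Tikhonov-regularized reconstruction of x0 from noisy y = G x0 + eta;
# components never observed are drawn toward the (zero) prior.
x0_true = np.sin(2 * np.pi * np.arange(n) / n)
sigma = 0.05
y = G @ x0_true + sigma * rng.standard_normal(G.shape[0])
alpha = sigma
x0_hat = Vt.T @ (s / (s**2 + alpha**2) * (U.T @ y))
print("reconstruction error:", np.linalg.norm(x0_hat - x0_true))
```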

Relevance:

30.00%

Publisher:

Abstract:

Airborne scanning laser altimetry (LiDAR) is an important new data source for river flood modelling. LiDAR can give dense and accurate Digital Terrain Models (DTMs) of floodplains for use as model bathymetry. Spatial resolutions of 0.5 m or less are possible, with a height accuracy of 0.15 m. LiDAR gives a Digital Surface Model (DSM), so vegetation removal software (e.g. TERRASCAN) must be used to obtain a DTM. An example used to illustrate the current state of the art is the LiDAR data provided by the EA, which has been processed by their in-house software to convert the raw data to a ground DTM and a separate vegetation height map. Their method distinguishes trees from buildings on the basis of object size. EA data products include the DTM with or without buildings removed, a vegetation height map, a DTM with bridges removed, etc.

Most vegetation removal software ignores short vegetation (less than, say, 1 m high), which typically covers most of a floodplain. We have attempted to extend vegetation height measurement to such short vegetation using local height texture. The idea is to assign friction coefficients depending on local vegetation height, so that friction is spatially varying (see the sketch after this abstract); this obviates the need to calibrate a global floodplain friction coefficient. It is not yet clear whether the method is useful, but it is worth testing further.

The LiDAR DTM is usually determined by looking for local minima in the raw data, then interpolating between these to form a space-filling height surface. This is a low-pass filtering operation, in which objects of high spatial frequency such as buildings, river embankments and walls may be incorrectly classed as vegetation. The problem is particularly acute in urban areas. A solution may be to apply pattern recognition techniques to LiDAR height data fused with other data types such as LiDAR intensity or multispectral CASI data. We are attempting to use digital map data (MasterMap structured topography data) to help distinguish buildings from trees, and roads from areas of short vegetation. The problems involved in doing this will be discussed, as will the related problem of how best to merge historic river cross-section data with a LiDAR DTM.

LiDAR data may also be used to help generate a finite element mesh. In rural areas we have decomposed a floodplain mesh according to taller vegetation features such as hedges and trees, so that, for example, hedge elements can be assigned higher friction coefficients than those in adjacent fields. We are attempting to extend this approach to urban areas, so that the mesh is decomposed in the vicinity of buildings, roads, etc., as well as trees and hedges. A dominant-points algorithm is used to identify points of high curvature on a building or road, which act as initial nodes in the meshing process. A difficulty is that the resulting mesh may contain a very large number of nodes; however, the mesh generated may be useful in allowing a high-resolution finite element model to act as a benchmark for a more practical lower-resolution model.

A further problem discussed is how best to exploit the data redundancy that arises because the LiDAR resolution is much higher than that of a typical flood model. Problems occur if features have dimensions smaller than the model cell size: for a 5 m wide embankment within a raster grid model with a 15 m cell size, the local maximum height of the embankment could be assigned to each cell covering it, but how could a 5 m wide ditch be represented? This redundancy has also been exploited to improve wetting/drying algorithms using the sub-grid-scale LiDAR heights within finite elements at the waterline.
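A minimal sketch of the spatially varying friction idea mentioned above: map a LiDAR-derived vegetation-height grid to per-cell Manning's n values instead of calibrating a single global floodplain coefficient. The height classes and friction values below are illustrative assumptions, not calibrated figures.

```python
# Assign spatially varying Manning's n from a vegetation-height grid.
# Thresholds and n values are illustrative placeholders.
import numpy as np

def friction_from_vegetation(height):
    """Map vegetation height (metres) to Manning's n per grid cell."""
    n = np.full(height.shape, 0.03)              # bare ground / short grass
    n[(height >= 0.1) & (height < 1.0)] = 0.05   # short vegetation
    n[(height >= 1.0) & (height < 5.0)] = 0.10   # hedges, shrubs
    n[height >= 5.0] = 0.15                      # trees
    return n

# Synthetic vegetation-height map standing in for a LiDAR product.
veg = np.abs(np.random.default_rng(3).normal(0.5, 1.5, (100, 100)))
manning_n = friction_from_vegetation(veg)
```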

Relevance:

30.00%

Publisher:

Abstract:

Two commercial enzyme products, Depol 40 (D) and Liquicell 2500 (L), were characterised from a biochemical standpoint and their potential to improve rumen degradation of forages was evaluated in vitro. Enzyme activities were determined at pH 5.5 and 39 °C. Analysis of the enzyme activities indicated that L contained higher xylanase and endoglucanase, but lower exoglucanase, pectinase and alpha-amylase activities than D. The Reading Pressure Technique (RPT) was used to investigate the effect of enzyme addition on the in vitro gas production (GP) and organic matter degradation (OMD) of alfalfa (Medicago sativa L.) stems and leaves. A completely randomised design with a factorial arrangement of treatments was used. Both alfalfa fractions were either untreated or treated with each enzyme at four levels, 20 h before incubation with rumen fluid. Each level of enzyme provided similar amounts of filter paper (D1, L1), endoglucanase (D2, L2), alpha-L-arabinofuranosidase (D3, L3) and xylanase (D4, L4) units per gram of forage DM. Enzymes increased the initial OMD in both fractions, with improvements of up to 15% in leaves (D4) and 8% in stems (L2) after 12 h of incubation. All enzyme treatments increased the extent of degradation (96 h incubation) in the leaf fraction, but only L2 increased final OMD in the stems. Direct hydrolysis of forage fractions during the pre-treatment period did not fully account for the magnitude of the increases in OMD, suggesting that the increase in the rate of degradation was achieved through a combined effect of direct enzyme hydrolysis and synergistic action between the exogenous (applied) and endogenous (rumen) enzymes. © 2003 Elsevier Science B.V. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

Building services are worth about 2% of GDP and are essential for the effective and efficient operation of buildings. It is increasingly recognised that the value of a building is related to the way it supports the client organisation's ongoing business operations. Building services are central to the functional performance of buildings and provide the necessary conditions for the health, well-being, safety and security of the occupants. They frequently comprise several technologically distinct sub-systems, and their design and construction require the involvement of numerous disciplines and trades. Designers and contractors working on the same project are frequently employed by different companies. Materials and equipment are supplied by a diverse range of manufacturers. Facilities managers are responsible for the operation of the building services in use. Coordination between these participants is crucial for achieving optimum performance, but it is too often neglected, leaving room for serious faults; effective integration is therefore essential. Modern technology offers increasing opportunities for integrated personal-control systems for lighting, ventilation and security, as well as for interoperability between systems. Opportunities for a new mode of systems integration are provided by the emergence of PFI/PPP procurement frameworks. This paper attempts to establish how systems integration can be achieved in the process of designing, constructing and operating building services. The essence of the paper, therefore, is to envisage the emergent organisational responses to the realisation of building services as an interactive systems network.

Relevance:

30.00%

Publisher:

Abstract:

This paper introduces PSOPT, an open-source optimal control solver written in C++. PSOPT uses pseudospectral and local discretizations, sparse nonlinear programming and automatic differentiation, and it incorporates automatic scaling and mesh refinement facilities. The software is able to solve complex optimal control problems including multiple phases, delayed differential equations, nonlinear path constraints, interior point constraints, integral constraints, and free initial and/or final times. The software does not require any non-free platform to run, not even the operating system, as it is able to run under Linux. Additionally, the software generates plots as well as LaTeX code so that its results can easily be included in publications. An illustrative example is provided.
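PSOPT's actual C++ API is not reproduced here; as a language-neutral illustration of the direct-transcription approach underlying such solvers, the sketch below discretizes a toy minimum-effort double-integrator problem with trapezoidal collocation and hands it to a general-purpose NLP solver.

```python
# Direct transcription sketch (not PSOPT's API): minimize the integral of
# u^2 for x1' = x2, x2' = u, x(0) = (0,0), x(1) = (1,0), using trapezoidal
# collocation and SLSQP as a stand-in NLP solver.
import numpy as np
from scipy.optimize import minimize

N = 30                        # collocation intervals
h = 1.0 / N                   # fixed final time for simplicity

def unpack(z):
    m = N + 1
    return z[:m], z[m:2 * m], z[2 * m:]

def objective(z):             # trapezoidal quadrature of u^2
    _, _, u = unpack(z)
    return h * np.sum(0.5 * (u[:-1] ** 2 + u[1:] ** 2))

def defects(z):               # collocation defects + boundary conditions
    x1, x2, u = unpack(z)
    d1 = x1[1:] - x1[:-1] - 0.5 * h * (x2[:-1] + x2[1:])
    d2 = x2[1:] - x2[:-1] - 0.5 * h * (u[:-1] + u[1:])
    bc = np.array([x1[0], x2[0], x1[-1] - 1.0, x2[-1]])
    return np.concatenate([d1, d2, bc])

z0 = np.zeros(3 * (N + 1))
sol = minimize(objective, z0, method="SLSQP",
               constraints={"type": "eq", "fun": defects})
print("converged:", sol.success, "cost:", sol.fun)  # analytic optimum is 12
```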

Relevance:

30.00%

Publisher:

Abstract:

A key strategy for improving the skill of quantitative precipitation forecasts, as well as forecasts of hazardous weather such as severe thunderstorms and flash floods, is to exploit observations of convective activity (e.g. from radar). In this paper, a convection-permitting ensemble prediction system (EPS), aimed at addressing the problems of forecasting localized weather events with relatively short predictability time scales and based on a 1.5 km grid-length version of the Met Office Unified Model, is presented. Particular attention is given to the impact of using predicted observations of radar-derived precipitation intensity in the ensemble transform Kalman filter (ETKF) used within the EPS. Our initial results, based on a 24-member ensemble of forecasts for two summer case studies, show that the convective-scale EPS produces fairly reliable forecasts of temperature, horizontal winds and relative humidity at 1 h lead time, as is evident from inspection of rank histograms. On the other hand, the rank histograms also suggest that the EPS generates too much spread for forecasts of (i) surface pressure and (ii) surface precipitation intensity. This may indicate that, for (i), the surface pressure observation error standard deviation used to generate the rank histograms is too large, while (ii) may be the result of non-Gaussian precipitation observation errors. However, further investigation is needed to better understand these findings. Finally, the inclusion of predicted observations of precipitation from radar in the 24-member EPS considered here does not appear to improve the 1 h lead-time forecast skill.
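For concreteness, a small sketch of the rank-histogram diagnostic referred to above, using synthetic data and a deliberately over-dispersive ensemble: a flat histogram indicates reliability, while the domed shape produced here mirrors the excess-spread signature reported for surface pressure and precipitation.

```python
# Rank-histogram sketch: rank each verifying value within the sorted
# ensemble and histogram the ranks.  Data are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(4)
n_cases, n_members = 2000, 24        # 24-member ensemble, as in the paper

truth = rng.standard_normal(n_cases)
# Over-dispersive ensemble: spread larger than the truth's variability.
ens = 2.0 * rng.standard_normal((n_cases, n_members))

ranks = np.sum(ens < truth[:, None], axis=1)      # rank 0..n_members
hist = np.bincount(ranks, minlength=n_members + 1)
print(hist)   # domed shape: middle ranks over-populated => too much spread
```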

Relevance:

30.00%

Publisher:

Abstract:

Classical risk assessment approaches for animal diseases are influenced by the probability of release, exposure and consequences of a hazard affecting a livestock population. Once a pathogen enters domestic livestock, potential risks of exposure and infection, both to animals and to people, extend through a chain of economic activities related to producing, buying and selling animals and products. Therefore, in order to understand the economic drivers of animal diseases in different ecosystems, and to devise effective and efficient measures to manage disease risks for a country or region, the entire value chain and the related markets for animals and products need to be analysed to arrive at practical and cost-effective risk management options agreed by the actors and players along those value chains. Value chain analysis enriches disease risk assessment by providing a framework for interdisciplinary collaboration, which seems to be in increasing demand for problems concerning infectious livestock diseases. The best way to achieve this is to ensure that veterinary epidemiologists and social scientists work together throughout the process at all levels.

Relevance:

30.00%

Publisher:

Abstract:

Ensemble forecasting of nonlinear systems involves the use of a model to run forward a discrete ensemble (or set) of initial states. Data assimilation techniques tend to focus on estimating the true state of the system, even though model error limits the value of such efforts. This paper argues for choosing the initial ensemble in order to optimise forecasting performance rather than estimate the true state of the system. Density forecasting and choosing the initial ensemble are treated as one problem. Forecasting performance can be quantified by some scoring rule. In the case of the logarithmic scoring rule, theoretical arguments and empirical results are presented. It turns out that, if the underlying noise dominates model error, we can diagnose the noise spread.
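As a minimal sketch of scoring a density forecast under the logarithmic scoring rule discussed above: the ensemble is turned into a density by Gaussian kernel dressing (the dressing and the bandwidth choice are illustrative assumptions, not the paper's method) and the negative log-density at the outcome is returned.

```python
# Logarithmic (ignorance) score of a kernel-dressed ensemble forecast.
# Gaussian dressing and the bandwidth are illustrative assumptions.
import numpy as np

def log_score(ensemble, outcome, bandwidth):
    """Negative log-likelihood of the outcome under the dressed density;
    lower is better."""
    z = (outcome - ensemble) / bandwidth
    dens = np.mean(np.exp(-0.5 * z ** 2) / (bandwidth * np.sqrt(2 * np.pi)))
    return -np.log(dens)

rng = np.random.default_rng(5)
ens = rng.standard_normal(24)              # a forecast ensemble
print(log_score(ens, outcome=0.3, bandwidth=0.5))
```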

Relevance:

30.00%

Publisher:

Abstract:

Formal statutory guidance to arbitrators involved in settling disputes over rents for agricultural holdings is contained in the Agricultural Holdings Act 1986. The particular features of the agricultural letting market raise valuation problems which the Act itself has failed to satisfactorily address, most notably the degree to which marriage value and scarcity should be taken into account. The 1995 Court of Appeal case of Childers v Anker addresses several of the key issues. This paper seeks to explore the findings and practical implications of the case for rental valuers and arbitrators. It argues that sitting tenants may be seriously disadvantaged by the court's judgements, not least by having to pay rents on review which reflect elements of marriage value and possibly scarcity value.

Relevance:

30.00%

Publisher:

Abstract:

Optimal state estimation from given observations of a dynamical system by data assimilation is generally an ill-posed inverse problem. In order to solve the problem, a standard Tikhonov, or L2, regularization is used, based on certain statistical assumptions on the errors in the data. The regularization term constrains the estimate of the state to remain close to a prior estimate. In the presence of model error, this approach does not capture the initial state of the system accurately, as the initial state estimate is derived by minimizing the average error between the model predictions and the observations over a time window. Here we examine an alternative L1 regularization technique that has proved valuable in image processing. We show that for examples of flow with sharp fronts and shocks, the L1 regularization technique performs more accurately than standard L2 regularization.
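A minimal illustration (not the paper's assimilation setting) of why the L1 penalty preserves sharp fronts that L2 smooths: denoise a noisy step signal, penalizing its discrete gradient, with the L2 problem solved in closed form and the L1 (total-variation) problem by iteratively reweighted least squares.

```python
# L2 vs L1 regularization on a signal with a sharp front.  Parameters and
# the denoising setting are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(6)
n = 200
x_true = np.where(np.arange(n) < n // 2, 0.0, 1.0)   # sharp front
y = x_true + 0.1 * rng.standard_normal(n)

D = np.diff(np.eye(n), axis=0)                       # discrete gradient

# L2: closed-form solution of min ||x - y||^2 + a2 ||D x||^2
a2 = 10.0
x_l2 = np.linalg.solve(np.eye(n) + a2 * D.T @ D, y)

# L1 (total variation) via iteratively reweighted least squares:
# each pass solves a weighted L2 problem approximating the L1 penalty.
lam, eps = 0.5, 1e-3
x_l1 = y.copy()
for _ in range(50):
    w = 1.0 / (np.abs(D @ x_l1) + eps)               # reweighting
    x_l1 = np.linalg.solve(np.eye(n) + lam * D.T @ (w[:, None] * D), y)

print("L2 error:", np.linalg.norm(x_l2 - x_true))    # front smeared
print("L1 error:", np.linalg.norm(x_l1 - x_true))    # front preserved
```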

Relevance:

30.00%

Publisher:

Abstract:

A neurofuzzy classifier identification algorithm is introduced for two-class problems. The initial fuzzy rule base construction is based on fuzzy clustering utilizing a Gaussian mixture model (GMM) and the analysis of variance (ANOVA) decomposition. The expectation-maximization (EM) algorithm is applied to determine the parameters of the fuzzy membership functions. The neurofuzzy model is then identified via the supervised subspace orthogonal least squares (OLS) algorithm. Finally, a logistic regression model is applied to produce the class probability. The effectiveness of the proposed neurofuzzy classifier is demonstrated on a real data set.
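A hedged sketch of the pipeline's spine using scikit-learn: an EM-fitted Gaussian mixture supplies fuzzy membership features and logistic regression maps them to class probabilities. The ANOVA decomposition and the supervised subspace OLS selection step of the paper are omitted for brevity.

```python
# GMM memberships (fitted by EM) as fuzzy features for a logistic
# regression classifier; a simplified stand-in for the full pipeline.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.mixture import GaussianMixture

X, y = make_classification(n_samples=400, n_features=5, random_state=0)

gmm = GaussianMixture(n_components=6, random_state=0).fit(X)  # EM step
memberships = gmm.predict_proba(X)     # fuzzy membership of each component

clf = LogisticRegression().fit(memberships, y)   # class probability model
print("training accuracy:", clf.score(memberships, y))
```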

Relevance:

30.00%

Publisher:

Abstract:

This paper develops a conceptual framework for analyzing emerging agricultural hydrology problems in post-conflict Libya, one of the most arid regions on the planet. As well as facing substantial political and social changes, post-conflict Libyan administrators are confronted with important hydrological issues in Libya's emerging water-land-use complex. This paper presents a substantial background to the water-land-use problem in Libya; reviews previous work, in Libya and elsewhere, on water-land-use issues and conflicts in dry and arid zones; outlines a conceptual framework for fruitful research interventions; and details the results of a survey of Libyan farmers' water usage, their perceptions of emerging water-land-use conflicts, and the appropriate value one should place on agricultural-use hydrological resources in Libya. Extensions are discussed.