181 results for conceptual data modelling


Relevance:

30.00%

Publisher:

Abstract:

We argue the case for a new branch of mathematics and its applications: Mathematics for the Digital Society. There is a challenge for mathematics: a strong “pull” from new and emerging commercial and public activities, and a need to train and inspire a generation of quantitative scientists who will seek careers within the associated sectors. Although the field is now going through an early phase of boiling up, prior to scholarly distillation, we discuss how data-rich activities and applications may benefit from a wide range of continuous and discrete models, methods, analysis and inference. In ten years' time such applications will be commonplace and associated courses may be embedded within the undergraduate curriculum.

Relevance:

30.00%

Publisher:

Abstract:

The development of eutrophication in river systems is poorly understood, given the complex relationships between fixed plants, algae, hydrodynamics, water chemistry and solar radiation. However, there is a pressing need to understand the relationship between the ecological status of rivers and the controlling environmental factors, to help the reasoned implementation of the Water Framework Directive and Catchment Sensitive Farming in the UK. This research aims to create a dynamic, process-based, mathematical in-stream model to simulate the growth and competition of different vegetation types (macrophytes, phytoplankton and benthic algae) in rivers. The model, applied to the River Frome (Dorset, UK), captured well the seasonality of the simulated vegetation types (suspended algae, macrophytes, epiphytes, sediment biofilm). Macrophyte results showed that local knowledge is important for explaining unusual changes in biomass. Fixed algae simulations indicated the need for a more detailed representation of the various herbivorous grazer groups; however, this would increase the model complexity, the number of model parameters and the observation data required to define the model. The model results also highlighted that simulating only phytoplankton is insufficient in river systems, because the majority of the suspended algae have a benthic origin in short-retention-time rivers. Therefore, there is a need for modelling tools that link the benthic and free-floating habitats.
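As a purely illustrative sketch of the kind of coupled growth-competition dynamics such a process-based in-stream model encodes (not the published River Frome model), two vegetation groups can be given seasonally forced logistic growth with cross-group competition; every parameter value below is hypothetical:

```python
# Illustrative two-group growth-competition sketch: macrophytes and
# epiphytes with seasonally forced logistic growth and mutual competition.
import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, b, r, K, alpha):
    """Logistic growth with cross-group competition and crude solar forcing."""
    m, e = b  # macrophyte and epiphyte biomass (hypothetical units)
    season = 1.0 + 0.8 * np.sin(2 * np.pi * t / 365.0)
    dm = r[0] * season * m * (1 - (m + alpha[0] * e) / K[0])
    de = r[1] * season * e * (1 - (e + alpha[1] * m) / K[1])
    return [dm, de]

sol = solve_ivp(rhs, (0, 730), [5.0, 1.0],
                args=([0.05, 0.08], [200.0, 80.0], [0.6, 0.3]))
print(np.round(sol.y[:, -1], 1))  # biomasses after two simulated years
```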

Relevance:

30.00%

Publisher:

Abstract:

The River Lugg has particular problems with high sediment loads, which have resulted in detrimental impacts on ecology and fisheries. A new dynamic, process-based model of hydrology and sediments (INCA-SED) has been developed and applied to the River Lugg system using an extensive data set covering 1995–2008. The model simulates sediment sources and sinks throughout the catchment and gives a good representation of the sediment response at 22 reaches along the River Lugg. A key question considered in using the model is the management of sediment sources so that concentrations and bed loads can be reduced in the river system. Altogether, five sediment management scenarios were selected for testing on the River Lugg, including land use change, contour tillage, hedging and buffer strips. Running the model with parameters altered to simulate these five scenarios produced some interesting results. All scenarios achieved some reduction in sediment levels, with the 40% land use change achieving the best result, a 19% reduction. The other scenarios also achieved significant reductions, of between 7% and 9%, with buffer strips producing the best of these at close to 9%. The results suggest that if hedge introduction, contour tillage and buffer strips were all applied, sediment reductions would total 24%, considerably improving the current sediment situation. We present a novel cost-effectiveness analysis of our results, in which we use the percentage of land removed from production as the cost function. Given the minimal loss of land associated with contour tillage, hedges and buffer strips, we suggest that these management practices are the most cost-effective combination for reducing sediment loads.
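A minimal sketch of the cost-effectiveness ranking described above, scoring each scenario by sediment reduction per percentage of land removed from production; the reduction figures are only loosely based on the ranges quoted in the abstract, and the land-take percentages are hypothetical placeholders:

```python
# Rank management scenarios by sediment reduction per % of land lost.
scenarios = {
    "40% land use change": {"reduction_pct": 19.0, "land_out_pct": 40.0},
    "contour tillage":     {"reduction_pct": 7.0,  "land_out_pct": 0.5},
    "hedge introduction":  {"reduction_pct": 8.0,  "land_out_pct": 1.0},
    "buffer strips":       {"reduction_pct": 9.0,  "land_out_pct": 2.0},
}
ranked = sorted(scenarios.items(),
                key=lambda kv: kv[1]["reduction_pct"] / kv[1]["land_out_pct"],
                reverse=True)
for name, s in ranked:
    score = s["reduction_pct"] / s["land_out_pct"]
    print(f"{name:20s} {score:6.1f} % sediment reduction per % land lost")
```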

Relevance:

30.00%

Publisher:

Abstract:

Quadratic programming techniques were applied to household food consumption data in England and Wales to estimate likely changes in diet under healthy eating guidelines, and the consequences this would have on agriculture and land use in England and Wales. The first step entailed imposing nutrient restrictions on food consumption following the dietary recommendations of the UK Department of Health. The resulting diet was used, in a second step, as a proxy for demand in agricultural commodities, to test the impact of such a scenario on food production and land use in England and Wales and the impacts of this on agricultural landscapes. Results of the diet optimisation indicated a large drop in consumption of foods rich in saturated fats and sugar, essentially cheese and sugar-based products, along with lesser cuts of fat and meat products. Conversely, consumption of fruit and vegetables, cereals and flour would increase to meet dietary fibre recommendations. Such a shift in demand would dramatically affect production patterns: the financial net margin of England and Wales agriculture would rise, owing to increased production of high-market-value and high-economic-margin crops. Some regions would, however, be negatively affected, mostly those dependent on beef cattle and sheep production that could not benefit from an increased demand for cereals and horticultural crops. The effects of these changes would also be felt in upstream industries, such as animal feed suppliers. While arable-dominated landscapes would be little affected, pastoral landscapes would suffer through loss of grazing management and, possibly, land abandonment, especially in upland areas.
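A minimal sketch of the first (diet optimisation) step under stated assumptions: find the diet closest, in a least-squares sense, to the observed one that satisfies nutrient bounds. The three foods, their nutrient contents and the limits are hypothetical placeholders, not the study's data:

```python
# Quadratic program: stay close to the observed diet subject to
# nutrient lower/upper bounds. All numbers are illustrative.
import cvxpy as cp
import numpy as np

x0 = np.array([120.0, 80.0, 200.0])      # observed intake (g/day): cheese, sugar products, vegetables
N = np.array([[0.21, 0.00, 0.001],       # saturated fat content per g
              [0.00, 0.95, 0.020],       # sugars per g
              [0.00, 0.00, 0.030]])      # dietary fibre per g
lower = np.array([0.0, 0.0, 9.0])        # e.g. minimum fibre (g/day)
upper = np.array([20.0, 60.0, 100.0])    # e.g. caps on saturated fat and sugar (g/day)

x = cp.Variable(3, nonneg=True)
prob = cp.Problem(cp.Minimize(cp.sum_squares(x - x0)),
                  [N @ x >= lower, N @ x <= upper])
prob.solve()
print(np.round(x.value, 1))  # diet shifts away from cheese and sugar, towards vegetables
```

The second step would then feed the optimised quantities into a supply and land-use model as commodity demands.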

Relevance:

30.00%

Publisher:

Abstract:

This paper introduces a new fast, effective and practical model structure construction algorithm for a mixture of experts network system, utilising only process data. The algorithm is based on a novel forward constrained regression procedure. Given a full set of experts as potential model bases, the structure construction algorithm, built on the forward constrained regression procedure, selects the most significant model bases one by one so as to minimise the overall system approximation error at each iteration, while the gate parameters in the mixture of experts network system are adjusted accordingly so as to satisfy the convex constraints required in the derivation of the forward constrained regression procedure. The procedure continues until a proper system model is constructed that utilises some or all of the experts. A pruning algorithm for the resulting mixture of experts network system is also derived, to yield an overall parsimonious construction algorithm. Numerical examples are provided to demonstrate the effectiveness of the new algorithms. The mixture of experts network framework can be applied to a wide variety of applications, ranging from multiple-model controller synthesis to multi-sensor data fusion.
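A hedged sketch of the flavour of this scheme (an illustration, not the paper's exact algorithm): experts are added greedily one at a time, and at each step the gate weights are refitted under the convexity constraints (non-negative, summing to one):

```python
# Greedy forward selection of experts with simplex-constrained gate weights.
import numpy as np
from scipy.optimize import minimize

t = np.linspace(0, 3, 200)
y = np.sin(t)                                       # target process output
E = np.column_stack([np.sin(t * k) for k in (0.5, 1.0, 2.0, 4.0)])  # candidate experts

def convex_fit(cols):
    """Least-squares gate weights on the simplex for the chosen experts."""
    A, k = E[:, cols], len(cols)
    res = minimize(lambda w: np.sum((A @ w - y) ** 2), np.full(k, 1.0 / k),
                   bounds=[(0, 1)] * k,
                   constraints={"type": "eq", "fun": lambda w: w.sum() - 1.0})
    return res.fun, res.x

selected, remaining = [], list(range(E.shape[1]))
while remaining:
    errs = {j: convex_fit(selected + [j])[0] for j in remaining}
    best = min(errs, key=errs.get)
    if selected and errs[best] >= convex_fit(selected)[0]:
        break                                       # no remaining expert improves the fit
    selected.append(best)
    remaining.remove(best)

sse, w = convex_fit(selected)
print("selected experts:", selected, "gate weights:", np.round(w, 3), "SSE:", round(sse, 4))
```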

Relevance:

30.00%

Publisher:

Abstract:

A common problem in many data-based modelling algorithms, such as associative memory networks, is the curse of dimensionality. In this paper, a new two-stage neurofuzzy system design and construction algorithm (NeuDeC) for nonlinear dynamical processes is introduced to tackle this problem effectively. A new, simple preprocessing method is initially derived and applied to reduce the rule base, followed by a fine model detection process based on the reduced rule set, using forward orthogonal least squares model structure detection. In both stages, new A-optimality experimental-design-based criteria were used. In the preprocessing stage, a lower bound of the A-optimality design criterion is derived and applied as a subset selection metric, while in the later stage the A-optimality design criterion is incorporated into a new composite cost function that minimises model prediction error as well as penalising the model parameter variance. The use of NeuDeC leads to unbiased model parameters with low parameter variance, with the additional benefit of a parsimonious model structure. Numerical examples are included to demonstrate the effectiveness of this new modelling approach for high-dimensional inputs.
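As a toy stand-in for the criteria used in NeuDeC (not the paper's composite cost function), an A-optimality-style subset score is the trace of the inverse information matrix; smaller values mean lower average parameter variance:

```python
# A-optimality subset score: trace((X'X)^-1) for candidate regressor subsets.
import numpy as np

def a_optimality(X, ridge=1e-8):
    """Trace of the (lightly regularised) inverse information matrix."""
    M = X.T @ X + ridge * np.eye(X.shape[1])
    return np.trace(np.linalg.inv(M))

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 6))                  # candidate regressor outputs
for subset in [(0, 1), (0, 1, 2), (0, 3, 5)]:
    print(subset, round(a_optimality(X[:, subset]), 4))  # prefer the lowest score
```

The composite cost described above would add such a term to the model prediction error, trading fit against parameter variance.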

Relevance:

30.00%

Publisher:

Abstract:

The modelling of nonlinear stochastic dynamical processes from data involves solving the problems of data gathering, preprocessing, model architecture selection, learning or adaptation, parametric evaluation and model validation. For a given model architecture, such as associative memory networks, a common problem in nonlinear modelling is the curse of dimensionality. A series of complementary data-based constructive identification schemes, mainly based on, but not limited to, operating-point-dependent fuzzy models, are introduced in this paper with the aim of overcoming the curse of dimensionality. These include (i) a mixture of experts algorithm based on a forward constrained regression algorithm; (ii) an inherently parsimonious piecewise local linear modelling concept based on a Delaunay input-space partition; (iii) a neurofuzzy model constructive approach based on forward orthogonal least squares and optimal experimental design; and finally (iv) a neurofuzzy model construction algorithm based on Bézier-Bernstein polynomial basis functions and the additive decomposition. Illustrative examples demonstrate their applicability, showing that the final major hurdle in data-based modelling has almost been removed.

Relevance:

30.00%

Publisher:

Abstract:

Modelling the interaction of terahertz (THz) radiation with biological tissue poses many interesting problems. THz radiation is not obviously described by either an electric field distribution or an ensemble of photons, and biological tissue is an inhomogeneous medium with an electronic permittivity that is both spatially and frequency dependent, making it a complex system to model. A three-layer system of parallel-sided slabs has been used as the system through which the passage of THz radiation has been simulated. Two modelling approaches have been developed: a thin film matrix model and a Monte Carlo model. The source data for each of these methods, taken at the same time as the data recorded to experimentally verify them, was a THz spectrum that had passed through air only. Experimental verification of these two models was carried out using a three-layered in vitro phantom. Simulated transmission spectrum data were compared to experimental transmission spectrum data, first to determine and then to compare the accuracy of the two methods. Good agreement was found, with typical results having a correlation coefficient of 0.90 for the thin film matrix model and 0.78 for the Monte Carlo model over the full THz spectrum. Further work is underway to improve the models above 1 THz.
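A hedged sketch of a thin-film (characteristic matrix) calculation for normal-incidence transmission through three parallel-sided layers in air; the layer indices here are real constants and the thicknesses placeholders, whereas the tissue model described above uses a spatially and frequency-dependent permittivity:

```python
# Characteristic (transfer) matrix transmission through three slabs in air.
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def layer_matrix(n, d, f):
    """Characteristic matrix of one homogeneous slab at frequency f (normal incidence)."""
    delta = 2 * np.pi * n * d * f / C   # phase thickness
    return np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                     [1j * n * np.sin(delta), np.cos(delta)]])

def transmission(layers, f):
    """Amplitude transmission with air (n = 1) on both sides."""
    M = np.eye(2, dtype=complex)
    for n, d in layers:
        M = M @ layer_matrix(n, d, f)
    return 2.0 / (M[0, 0] + M[0, 1] + M[1, 0] + M[1, 1])

layers = [(2.1, 0.5e-3), (1.6, 1.0e-3), (2.1, 0.5e-3)]   # (index, thickness in m)
freqs = np.linspace(0.1e12, 1.0e12, 50)                   # 0.1-1.0 THz
T = np.array([abs(transmission(layers, f)) ** 2 for f in freqs])
print(np.round(T[:5], 3))  # power transmission at the low-frequency end
```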

Relevance:

30.00%

Publisher:

Abstract:

This paper discusses how the use of computer-based modelling tools has aided the design of a telemetry unit for use with oil well logging. With the aid of modern computer-based simulation techniques, the new design is capable of operating at data rates 2.5 times faster than those of previous designs.

Relevance:

30.00%

Publisher:

Abstract:

Remote sensing is the only practicable means of observing snow at large scales. Measurements from passive microwave instruments have been used to derive snow climatology since the late 1970s, but the algorithms used were limited by the computational power of the era. Simplifications such as the assumption of constant snow properties enabled snow mass to be retrieved from the microwave measurements, but large errors arise from those assumptions, which are still used today. A better approach is to perform retrievals within a data assimilation framework, where a physically based model of the snow properties can be used to produce the best estimate of the snow cover, in conjunction with multi-sensor observations such as grain size, surface temperature and microwave radiation. We have extended an existing snow model, SNOBAL, to incorporate mass and energy transfer in the soil and to simulate the growth of the snow grains. An evaluation of this model is presented, and techniques for the development of new retrieval systems are discussed.
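As a toy illustration of the assimilation idea (not the SNOBAL-based system itself), a scalar Kalman-style update blends a model forecast of snow water equivalent with an observation, weighted by their error variances; the numbers are hypothetical:

```python
# Scalar Kalman-style analysis: blend forecast and observation by variance.
def kalman_update(x_f, var_f, y, var_y):
    """Return analysis state and variance for one scalar observation."""
    k = var_f / (var_f + var_y)          # Kalman gain
    x_a = x_f + k * (y - x_f)
    var_a = (1.0 - k) * var_f
    return x_a, var_a

swe_forecast, swe_obs = 120.0, 95.0      # snow water equivalent (mm), hypothetical
x_a, var_a = kalman_update(swe_forecast, var_f=400.0, y=swe_obs, var_y=225.0)
print(round(x_a, 1), round(var_a, 1))    # analysis leans toward the lower-error estimate
```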

Relevance:

30.00%

Publisher:

Abstract:

In this article, a simple and effective controller design is introduced for Hammerstein systems that are identified from observational input/output data. The nonlinear static function in the Hammerstein system is modelled using a B-spline neural network. The controller is composed of the inverse of the B-spline-approximated nonlinear static function, together with a linear pole assignment controller. The contribution of this article is an inverse De Boor algorithm that computes this inverse efficiently. Mathematical analysis is provided to prove the convergence of the proposed algorithm. Numerical examples are used to demonstrate the efficacy of the proposed approach.
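As a hedged stand-in for the efficient inverse De Boor algorithm (which this abstract does not detail), inverting a monotonically increasing B-spline static function can be illustrated by simple bisection over scipy's De Boor-based evaluator; the knots and coefficients below are arbitrary:

```python
# Invert a monotone B-spline nonlinearity by bisection (illustration only).
import numpy as np
from scipy.interpolate import BSpline

k = 3
t = np.concatenate(([0] * (k + 1), [0.5], [1] * (k + 1)))   # clamped knot vector
c = np.array([0.0, 0.3, 0.9, 1.8, 3.0])                     # increasing coeffs -> increasing spline
spl = BSpline(t, c, k)

def invert(spl, y, lo=0.0, hi=1.0, tol=1e-10):
    """Bisection inverse of a monotonically increasing spline on [lo, hi]."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if spl(mid) < y:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

u = invert(spl, y=1.2)
print(u, float(spl(u)))   # spl(u) should be ~1.2
```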

Relevance:

30.00%

Publisher:

Abstract:

The problem of reconstructing the (otherwise unknown) source and sink field of a tracer in a fluid is studied by developing and testing a simple tracer transport model of a single-level global atmosphere and a dynamic data assimilation system. The source/sink field (taken to be constant over a 10-day assimilation window) and the initial tracer field are analysed together by assimilating imperfect tracer observations over the window. Experiments show that useful information about the source/sink field may be determined from relatively few observations when the initial tracer field is known very accurately a priori, even when the a priori source/sink information is biased (the source/sink a priori is set to zero). In this case each observation provides information about the source/sink field at positions upstream, and the assimilation of many observations together can reasonably determine the location and strength of a test source.
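A toy linear version of the idea, under stated assumptions: with the initial tracer field known exactly, the tracer at any later time is linear in a constant source field, so the source can be estimated from a few downstream observations by least squares. The grid, wind and observation sites are hypothetical:

```python
# Estimate a constant tracer source on a periodic 1-D grid from sparse
# observations at the end of an assimilation window.
import numpy as np

n, steps = 40, 10
A = np.roll(np.eye(n), -1, axis=1)          # advect tracer one cell per step (periodic)
c0 = np.zeros(n)                            # initial tracer field, known exactly
s_true = np.zeros(n); s_true[12] = 2.0      # one point source (unknown to the analysis)

def source_operator(A, steps):
    """G such that c_steps = A^steps c0 + G s for a constant source s."""
    G = np.zeros_like(A)
    for _ in range(steps):
        G = A @ G + np.eye(len(A))
    return G

G = source_operator(A, steps)
obs_idx = [14, 18, 22, 30]                  # few downstream observation sites
H = np.eye(n)[obs_idx]
rng = np.random.default_rng(2)
y = H @ (np.linalg.matrix_power(A, steps) @ c0 + G @ s_true) + rng.normal(0, 0.05, len(obs_idx))

s_hat, *_ = np.linalg.lstsq(H @ G, y, rcond=None)
print(np.round(s_hat[8:16], 2))  # largest weights fall upstream of the observers, bracketing cell 12
```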

Relevance:

30.00%

Publisher:

Abstract:

Two-dimensional flood inundation modelling is a widely used tool to aid flood risk management. In urban areas, the model spatial resolution required to represent flows through a typical street network often results in an impractical computational cost at the city scale. This paper presents the calibration and evaluation of a recently developed formulation of the LISFLOOD-FP model, which is more computationally efficient at these resolutions. Aerial photography was available for model evaluation on three days between 24 and 31 July. The new formulation was benchmarked against the original version of the model at 20 m and 40 m resolutions, demonstrating equally accurate simulation given the evaluation data, but with 67 times faster computation. The July event was then simulated at the 2 m resolution of the available airborne LiDAR DEM. This resulted in more accurate simulation of the floodplain drying dynamics compared with the coarse-resolution models, although maximum inundation levels were simulated equally well at all resolutions tested.
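The abstract does not state the evaluation measure used, but a common way to score simulated against observed inundation extent is a binary wet/dry fit of the form F = |model AND obs| / |model OR obs|; a minimal sketch with hypothetical wet/dry masks:

```python
# Binary wet/dry fit score between simulated and observed flood extents.
import numpy as np

def flood_fit(model_wet, obs_wet):
    """Intersection over union of the wet areas (1.0 = perfect agreement)."""
    both = np.logical_and(model_wet, obs_wet).sum()
    either = np.logical_or(model_wet, obs_wet).sum()
    return both / either

rng = np.random.default_rng(3)
obs = rng.random((100, 100)) < 0.3            # hypothetical observed wet cells
model = obs.copy(); model[:10] = ~model[:10]  # model errs along one edge
print(round(flood_fit(model, obs), 3))
```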

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a new approach to modelling flash floods in dryland catchments by integrating remote sensing and digital elevation model (DEM) data in a geographical information system (GIS). The spectral reflectance of channels affected by recent flash floods exhibits a marked increase, due to the deposition of fine sediments in these channels as the flood recedes. This allows the parts of a catchment that have been affected by a recent flood event to be discriminated from unaffected parts, using a time series of Landsat images. Using images of the Wadi Hudain catchment in southern Egypt, the hillslope areas contributing flow were inferred for different flood events. The SRTM3 DEM was used to derive flow direction, flow length, active channel cross-sectional areas and slope. The Manning equation was used to estimate the channel flow velocities, and hence the time-area zones of the catchment. A channel reach that was active during a 1985 runoff event, and that does not receive any tributary flow, was used to estimate a transmission loss rate of 7.5 mm h⁻¹, given the maximum peak discharge estimate. Runoff patterns resulting from different flood events are quite variable; however, the southern part of the catchment appears to have experienced more floods during the period of study (1984–2000), perhaps because the bedrock hillslopes in this area are more effective at runoff production than other parts of the catchment, which are underlain by unconsolidated Quaternary sands and gravels. Owing to the high transmission loss, runoff generated within the upper reaches is rarely delivered to the alluvial fan and Shalateen city, situated at the catchment outlet. The synthetic GIS-based time-area zones, on their own, cannot be relied upon to model the hydrographs reliably; physical parameters such as rainfall intensity, distribution and transmission loss must also be considered.
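The velocity step above uses the Manning equation, v = (1/n) R^(2/3) S^(1/2); a minimal sketch with placeholder cross-section values, converting reach geometry into a velocity and a travel time for the time-area zoning:

```python
# Manning's equation: channel geometry -> flow velocity -> reach travel time.
def manning_velocity(n, R, S):
    """Mean velocity (m/s): n roughness, R hydraulic radius (m), S slope (-)."""
    return (1.0 / n) * R ** (2.0 / 3.0) * S ** 0.5

area, wetted_perimeter, slope = 12.0, 14.0, 0.004   # hypothetical reach cross-section
v = manning_velocity(n=0.035, R=area / wetted_perimeter, S=slope)
travel_time_s = 1_500.0 / v                         # time to traverse a 1.5 km reach
print(round(v, 2), "m/s,", round(travel_time_s / 60, 1), "min")
```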

Relevance:

30.00%

Publisher:

Abstract:

We review the procedures and challenges that must be considered when using geoid data derived from the Gravity field and steady-state Ocean Circulation Explorer (GOCE) mission in order to constrain the circulation and water mass representation in an ocean general circulation model. This covers the combination of the geoid information with time-mean sea level information derived from satellite altimeter data to construct a mean dynamic topography (MDT), and considers how this complements the time-varying sea level anomaly, also available from the satellite altimeter. We particularly consider the compatibility of these different fields in their spatial scale content, their temporal representation, and their error covariances. These considerations are very important when the resulting data are to be used to estimate ocean circulation and its corresponding errors. We describe the further steps needed for assimilating the resulting dynamic topography information into an ocean circulation model using three different operational forecasting and data assimilation systems. We look at methods used for assimilating altimeter anomaly data in the absence of a suitable geoid, and then discuss different approaches which have been tried for assimilating the additional geoid information. We review the problems that have been encountered and the lessons learned in order to help future users. Finally, we present some results from the use of GRACE geoid information in the operational oceanography community and discuss the future potential gains that may be obtained from a new GOCE geoid.
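The core combination step described above, reduced to its essentials: the mean dynamic topography is the time-mean sea surface from altimetry minus the geoid height, and its gradients give surface geostrophic currents. The gridded fields below are synthetic placeholders, not real GOCE or altimeter data:

```python
# MDT = mean sea surface - geoid; geostrophic currents from its gradients.
import numpy as np

g, f = 9.81, 1.0e-4                      # gravity; mid-latitude Coriolis parameter
dx = dy = 100_000.0                      # 100 km grid spacing, hypothetical

rng = np.random.default_rng(4)
mss = np.cumsum(rng.normal(0, 0.01, (50, 50)), axis=0)   # placeholder mean sea surface (m)
geoid = np.zeros((50, 50))                               # placeholder geoid height (m)

mdt = mss - geoid                        # mean dynamic topography
detady, detadx = np.gradient(mdt, dy, dx)
u = -(g / f) * detady                    # eastward geostrophic velocity
v = (g / f) * detadx                     # northward geostrophic velocity
print(round(float(np.abs(u).mean()), 3), "m/s mean |u|")
```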