860 results for conceptual data modelling
Abstract:
Across Europe, elevated phosphorus (P) concentrations in lowland rivers have made them particularly susceptible to eutrophication. This is compounded in southern and central UK by increasing pressures on water resources, which may be further exacerbated by the potential effects of climate change. The EU Water Framework Directive requires an integrated approach to water resources management at the catchment scale and highlights the need for modelling tools that can distinguish relative contributions from multiple nutrient sources and are consistent with the information content of the available data. Two such models are introduced and evaluated within a stochastic framework using daily flow and total phosphorus concentrations recorded in a clay catchment typical of many areas of the lowland UK. Both models disaggregate empirical annual load estimates, derived from land use data, as a function of surface/near-surface runoff, generated using a simple conceptual rainfall-runoff model. Estimates of the daily load from agricultural land, together with those from baseflow and point sources, feed into an in-stream routing algorithm. The first model assumes constant concentrations in runoff via surface/near-surface pathways and incorporates an additional P store in the river-bed sediments, depleted above a critical discharge, to explicitly simulate resuspension. The second, simpler model simulates P concentrations as a function of surface/near-surface runoff, thus emphasising the influence of non-point source loads during flow peaks and the mixing of baseflow and point sources during low flows. The temporal consistency of parameter estimates, and thus the suitability of each approach, is assessed dynamically using a new approach based on Monte Carlo analysis.
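The model equations are not reproduced in the abstract; the sketch below is only a hedged illustration of the structure of the second, simpler model under stated assumptions (the annual diffuse load is disaggregated in proportion to daily surface/near-surface runoff, while baseflow and point sources dominate the mixture at low flows). The function name, parameter names and units are hypothetical.

```python
import numpy as np

def daily_tp_concentration(q_total, q_surface, annual_diffuse_load,
                           baseflow_conc, point_load):
    # Hypothetical sketch of the second (simpler) model structure.
    # q_total, q_surface  : daily discharge and surface/near-surface runoff (m3/day)
    # annual_diffuse_load : empirical annual P load from agricultural land (kg)
    # baseflow_conc       : assumed constant P concentration in baseflow (kg/m3)
    # point_load          : constant daily point-source P load (kg/day)
    q_total = np.asarray(q_total, float)
    q_surface = np.asarray(q_surface, float)

    # Disaggregate the annual diffuse load in proportion to surface/near-surface
    # runoff, so non-point sources dominate during flow peaks.
    diffuse_load = annual_diffuse_load * q_surface / q_surface.sum()

    # Baseflow and point sources dominate the mixture during low flows.
    baseflow_load = baseflow_conc * (q_total - q_surface)

    total_load = diffuse_load + baseflow_load + point_load   # kg/day
    return total_load / q_total                              # kg/m3
```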
Abstract:
Recent severe flooding in the UK has highlighted the need for better information on flood risk, increasing the pressure on engineers to enhance the capabilities of computer models for flood prediction. This paper evaluates the benefits to be gained from the use of remotely sensed data to support flood modelling. The remotely sensed data available can be used either to produce high-resolution digital terrain models (DTMs) from light detection and ranging (Lidar) data, or to generate accurate inundation maps of past flood events from airborne synthetic aperture radar (SAR) data and aerial photography. The paper reports on the modelling of real flood events that occurred at two UK sites on the rivers Severn and Ouse. At these sites a combination of remotely sensed data and recorded hydrographs was available. It is concluded, first, that Lidar-generated DTMs support the generation of considerably better models and enhance the visualisation of model results and, second, that flood outlines obtained from airborne SAR or aerial images help develop an appreciation of the hydraulic behaviour of important model components and facilitate model validation. The need for further research is highlighted by a number of limitations, namely: the difficulties in obtaining an adequate representation of hydraulically important features such as embankment crests and walls; uncertainties in the validation data; and difficulties in extracting flood outlines from airborne SAR images in urban areas.
Abstract:
Many time series are measured monthly, either as averages or totals, and such data often exhibit seasonal variability: the values of the series are consistently larger for some months of the year than for others. A typical series of this type is the number of deaths each month attributed to SIDS (Sudden Infant Death Syndrome). Seasonality can be modelled in a number of ways. This paper describes and discusses various methods for modelling seasonality in SIDS data, though much of the discussion is relevant to other seasonally varying data. There are two main approaches: fitting a circular probability distribution to the data, or using regression-based techniques to model the mean seasonal behaviour. Both are discussed in this paper.
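The abstract does not specify the regression form; a common regression-based choice for monthly counts is a single-harmonic (sine/cosine) model of the seasonal mean, sketched below under that assumption.

```python
import numpy as np

def fit_seasonal_mean(monthly_counts):
    # Fit mu_t = a + b*cos(2*pi*t/12) + c*sin(2*pi*t/12) by ordinary least squares.
    y = np.asarray(monthly_counts, float)
    t = np.arange(len(y))
    X = np.column_stack([np.ones_like(y),
                         np.cos(2 * np.pi * t / 12),
                         np.sin(2 * np.pi * t / 12)])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef, X @ coef   # harmonic coefficients and fitted seasonal mean
```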
Abstract:
Constructing biodiversity richness maps from environmental niche models (ENMs) of thousands of species is time consuming. A separate species occurrence data pre-processing phase enables the experimenter to control test AUC score variance due to species dataset size. Besides removing duplicate occurrences and points with missing environmental data, we discuss the need for coordinate precision, wide dispersion, temporal, and synonymity filters. After species data filtering, the final task of a pre-processing phase should be the automatic generation of species occurrence datasets that can then be directly 'plugged in' to the ENM. A software application capable of carrying out all these tasks will be a valuable time-saver, particularly for large-scale biodiversity studies.
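As an illustration of the filtering steps listed above, a hedged Python/pandas sketch follows; the column names ('species', 'lon', 'lat'), the thresholds, and the omission of the dispersion, temporal and synonymity filters are assumptions rather than the authors' implementation.

```python
import pandas as pd

def preprocess_occurrences(df, min_decimals=2, min_records=30):
    # df is assumed to hold columns 'species', 'lon', 'lat' plus one column
    # per environmental layer; names and thresholds are illustrative only.

    # Remove duplicate occurrences and points with missing environmental data.
    df = df.drop_duplicates(subset=["species", "lon", "lat"]).dropna()

    # Coordinate-precision filter: keep points reported with enough decimals.
    def decimals(value):
        text = str(value)
        return len(text.split(".")[1]) if "." in text else 0

    df = df[df["lon"].map(decimals).ge(min_decimals)
            & df["lat"].map(decimals).ge(min_decimals)]

    # Keep only species with enough records to control test-AUC variance.
    counts = df.groupby("species")["lon"].transform("size")
    return df[counts >= min_records]
```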
Abstract:
Health care providers, purchasers and policy makers need to make informed decisions regarding the provision of cost-effective care. When a new health care intervention is to be compared with the current standard, an economic evaluation alongside an evaluation of health benefits provides useful information for the decision-making process. We consider the information on cost-effectiveness which arises from an individual clinical trial comparing the two interventions. Recent methods for conducting a cost-effectiveness analysis for a clinical trial have focused on the net benefit parameter. The net benefit parameter, a function of costs and health benefits, is positive if the new intervention is cost-effective compared with the standard. In this paper we describe frequentist and Bayesian approaches to cost-effectiveness analysis which have been suggested in the literature and apply them to data from a clinical trial comparing laparoscopic surgery with open mesh surgery for the repair of inguinal hernias. We extend the Bayesian model to allow the total cost to be divided into a number of different components. The advantages and disadvantages of the different approaches are discussed. In January 2001, NICE issued guidance on the type of surgery to be used for inguinal hernia repair. We discuss our example in the light of this information.
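The abstract does not give the formula, but the net benefit is conventionally defined per patient as NB = λ·e − c, where e is the health benefit, c the cost and λ the willingness to pay per unit of benefit; the new intervention is cost-effective when the mean incremental net benefit is positive. A minimal frequentist sketch under that convention (the function name and the normal-approximation interval are assumptions):

```python
import numpy as np

def incremental_net_benefit(cost_new, eff_new, cost_std, eff_std, wtp, z=1.96):
    # Per-patient net benefit in each arm: NB_i = wtp * e_i - c_i.
    def arm_stats(cost, eff):
        nb = wtp * np.asarray(eff, float) - np.asarray(cost, float)
        return nb.mean(), nb.var(ddof=1) / len(nb)

    m_new, v_new = arm_stats(cost_new, eff_new)
    m_std, v_std = arm_stats(cost_std, eff_std)
    inb = m_new - m_std                       # positive => new arm cost-effective at wtp
    se = np.sqrt(v_new + v_std)
    return inb, (inb - z * se, inb + z * se)  # estimate and normal-approximation CI
```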
Abstract:
Analysis of X-ray powder data for the melt-crystallisable aromatic poly(thioether thioether ketone) [-S-Ar-S-Ar-CO-Ar](n) ('PTTK', Ar = 1,4-phenylene) reveals that it adopts a crystal structure very different from that established for its ether analogue, PEEK. Molecular modelling and diffraction-simulation studies of PTTK show that the structure of this polymer is analogous to that of melt-crystallised poly(thioether ketone) [-S-Ar-CO-Ar](n), in which the carbonyl linkages in symmetry-related chains are aligned antiparallel to one another, and that these bridging units are crystallographically interchangeable. The final model for the crystal structure of PTTK is thus disordered, in the monoclinic space group I2/a (two chains per unit cell), with cell dimensions a = 7.83, b = 6.06, c = 10.35 Å, β = 93.47°.
Abstract:
Many G protein-coupled receptors have been shown to exist as oligomers, but the oligomerization state and the effects of this on receptor function are unclear. For some G protein-coupled receptors, different radioligands provide different maximal binding capacities in ligand binding assays. Here we have developed mathematical models for co-expressed dimeric and tetrameric species of receptors. We have considered models where the dimers and tetramers are in equilibrium and where they do not interconvert, and we have also considered the potential influence of the ligands on the degree of oligomerization. By analogy with agonist efficacy, we have considered ligands that promote, inhibit or have no effect on oligomerization. Cell surface receptor expression and the intrinsic capacity of receptors to oligomerize are quantitative parameters of the equations. The models can account for differences in the maximal binding capacities of radioligands in different preparations of receptors and provide a conceptual framework for simulation and data fitting in complex oligomeric receptor situations.
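The model equations are not reproduced in the abstract; the toy sketch below is not the authors' model and only illustrates the qualitative point that co-expressed dimers and tetramers can yield different apparent maximal binding capacities when different radioligands label different numbers of sites per oligomer (all names and parameter values are illustrative).

```python
import numpy as np

def specific_binding(L, Kd, n_dimer, n_tetramer, sites_per_dimer, sites_per_tetramer):
    # Total specific binding of a radioligand at free concentration L to a
    # fixed population of dimers and tetramers, assuming independent and
    # identical sites with dissociation constant Kd (toy model only).
    occupancy = L / (L + Kd)
    bmax = sites_per_dimer * n_dimer + sites_per_tetramer * n_tetramer
    return bmax * occupancy

# Two radioligands on the same receptor population, 'seeing' different numbers
# of sites per oligomer, give different apparent Bmax values.
L = np.logspace(-11, -7, 50)   # free radioligand concentration (M)
b_all_sites = specific_binding(L, 1e-9, n_dimer=50, n_tetramer=25,
                               sites_per_dimer=2, sites_per_tetramer=4)
b_one_per_dimer = specific_binding(L, 1e-9, n_dimer=50, n_tetramer=25,
                                   sites_per_dimer=1, sites_per_tetramer=2)
```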
Abstract:
Purpose – While Freeman's stakeholder management approach has attracted much attention from both scholars and practitioners, little empirical work has considered the interconnectedness of organisational perspectives and stakeholder perspectives. The purpose of this paper is to respond to this gap by developing and empirically testing a bi-directional model of organisation/stakeholder relationships. Design/methodology/approach – A conceptual framework is developed that integrates how stakeholders are affected by organisations with how they affect organisations. Quantitative data relating to both sides of the relationship are obtained from 700 customers of a European service organisation and analysed using the partial least squares structural equation modelling technique. Findings – The findings provide empirical support for the notion of mutual dependency between organisations and stakeholders as advocated by stakeholder theorists. The results suggest that the way stakeholders relate to organisations depends on how organisations relate to stakeholders. Originality/value – The study is original on two fronts: first, it provides a framework and process that can be used by researchers to model bi-directional research with other stakeholder groups and in different contexts. Second, the study presents an example application of bi-directional research by empirically linking organisational and stakeholder expectations in the case of customers of a UK service organisation.
Abstract:
Electrospinning is a technique employed to produce nanoscale to microscale fibres by the application of a high voltage to a spinneret containing a polymer solution. Here we examine how small angle neutron scattering data can be modelled to analyse the polymer chain conformation. We prepared fibres of 1:1 blends of deuterated and hydrogenated atactic polystyrene from solutions in N,N-dimethylformamide and methyl ethyl ketone. The fibres themselves often contain pores or voids within the internal structure on length scales that can interfere with scattering experiments. We have developed a model to fit the scattering data, which includes the scattering from the voids, in order to obtain values for the radius of gyration of the polymer molecules within the fibres. Using this model we find that the radius of gyration is 20% larger than in the bulk state and that the chains are slightly extended parallel to the fibre axis.
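The abstract does not give the fitted model; as a sketch, one might combine a Debye (Gaussian-coil) term for the labelled chains with a Guinier-like term for the voids and a flat incoherent background, with all function names, parameters and starting values below being assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def debye(q, rg):
    # Debye form factor for a Gaussian chain of radius of gyration rg (q > 0).
    x = (q * rg) ** 2
    return 2.0 * (np.exp(-x) - 1.0 + x) / x ** 2

def sans_model(q, scale_chain, rg, scale_void, rg_void, bkg):
    # Labelled-chain (Debye) term + Guinier-like void term + flat background.
    return (scale_chain * debye(q, rg)
            + scale_void * np.exp(-(q * rg_void) ** 2 / 3.0)
            + bkg)

# Hypothetical fit of measured intensities I_data at wavevectors q_data:
# popt, pcov = curve_fit(sans_model, q_data, I_data,
#                        p0=[1.0, 90.0, 10.0, 300.0, 0.01])
```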
Abstract:
In a world of almost permanent and rapidly increasing electronic data availability, techniques for filtering, compressing, and interpreting this data to transform it into valuable and easily comprehensible information are of utmost importance. One key topic in this area is the capability to deduce future system behaviour from a given data input. This book brings together for the first time the complete theory of data-based neurofuzzy modelling and the linguistic attributes of fuzzy logic in a single cohesive mathematical framework. After introducing the basic theory of data-based modelling, new concepts including extended additive and multiplicative submodels are developed, and their extensions to state estimation and data fusion are derived. All these algorithms are illustrated with benchmark and real-life examples to demonstrate their efficiency. Chris Harris and his group have carried out pioneering work which has tied together the fields of neural networks and linguistic rule-based algorithms. This book is aimed at researchers and scientists in time series modelling, empirical data modelling, knowledge discovery, data mining, and data fusion.