27 results for Future applications
in CentAUR: Central Archive, University of Reading - UK
Abstract:
Global hydrological models (GHMs) simulate the land surface hydrological dynamics of continental-scale river basins. Here we describe one such GHM, the Macro-scale Probability-Distributed Moisture model (Mac-PDM.09). The model has undergone a number of revisions since it was last applied in the hydrological literature, and this paper provides a detailed description of the latest version. The main revisions are: (1) the model can be run for n repetitions, which provides more robust estimates of extreme hydrological behaviour; (2) the model can use a gridded field of the coefficient of variation (CV) of daily rainfall for the stochastic disaggregation of monthly precipitation to daily precipitation; and (3) the model can now be forced with daily as well as monthly input climate data. We demonstrate the effect of each of these three revisions on simulated runoff relative to the model before the revisions were applied. Importantly, we show that forcing Mac-PDM.09 with monthly input data results in a negative runoff bias, relative to daily forcings, for regions of the globe where the day-to-day variability in relative humidity is high. The runoff bias can reach -80% for a small selection of catchments, although the absolute magnitude of the bias may be small. As such, we recommend that future applications of Mac-PDM.09 using monthly climate forcings acknowledge the bias as a limitation of the model. The performance of Mac-PDM.09 is evaluated by validating simulated runoff against observed runoff for 50 catchments. We also present a sensitivity analysis demonstrating that simulated runoff is considerably more sensitive to the method of potential evaporation (PE) calculation than to perturbations in the soil moisture and field capacity parameters.
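The stochastic disaggregation in revision (2) can be illustrated with a minimal sketch. This is not the actual Mac-PDM.09 scheme, and all function and parameter names are illustrative assumptions: daily amounts are drawn from a gamma distribution whose coefficient of variation matches the gridded CV value, then rescaled so the days conserve the monthly total.

```python
import numpy as np

def disaggregate_monthly(monthly_total, n_days, cv, rng=None):
    """Illustrative stochastic disaggregation of a monthly precipitation
    total into daily values with a prescribed coefficient of variation.

    Daily amounts are drawn from a gamma distribution whose CV matches
    the target, then rescaled so the days sum exactly to the month.
    """
    rng = np.random.default_rng() if rng is None else rng
    mean_daily = monthly_total / n_days
    # For a gamma distribution, CV = 1/sqrt(shape), so shape = 1/CV^2.
    shape = 1.0 / cv**2
    scale = mean_daily / shape
    daily = rng.gamma(shape, scale, size=n_days)
    # Rescale to conserve the monthly total exactly.
    daily *= monthly_total / daily.sum()
    return daily

# Hypothetical grid cell: 120 mm over a 30-day month, CV of 1.5.
daily = disaggregate_monthly(monthly_total=120.0, n_days=30, cv=1.5)
```

A real scheme would also decide which days are wet versus dry; the sketch only shows how a CV field could control the spread of the daily amounts.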
Abstract:
We solve eight partial-differential, two-dimensional, nonlinear mean field equations, which describe the dynamics of large populations of cortical neurons. Linearized versions of these equations have been used to generate, with physiologically plausible parameters, the strong resonances observed in the human EEG, in particular the α-rhythm (8–13 Hz). We extend these results here by numerically solving the full equations on a cortex of realistic size, which receives appropriately "colored" noise as extra-cortical input. A brief summary of the numerical methods is provided. As an outlook to future applications, we explain how the effects of GABA-enhancing general anaesthetics can be simulated, and we present initial results.
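One common way to produce "colored" (temporally correlated) input noise of the kind mentioned above is an Ornstein-Uhlenbeck process: white noise low-pass filtered with a finite correlation time. The sketch below is a generic illustration, not the authors' numerical scheme; the names and parameter values are assumptions.

```python
import numpy as np

def ou_noise(n_steps, dt, tau, sigma, rng=None):
    """Ornstein-Uhlenbeck process: colored noise with correlation time
    tau and stationary standard deviation sigma, using the exact
    one-step update x[k+1] = x[k]*exp(-dt/tau) + increment.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.zeros(n_steps)
    decay = np.exp(-dt / tau)
    # Increment s.d. chosen so the stationary variance is sigma^2.
    incr_sd = sigma * np.sqrt(1.0 - decay**2)
    for k in range(1, n_steps):
        x[k] = x[k - 1] * decay + incr_sd * rng.standard_normal()
    return x

# Hypothetical input trace: 0.1 ms steps, 1 ms correlation time.
drive = ou_noise(n_steps=5000, dt=0.1, tau=1.0, sigma=0.5)
```

Because the exact discrete update is used, the trace has the prescribed stationary variance for any step size, which is convenient when the integration step of the field equations changes.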
Abstract:
A global river routing scheme coupled to the ECMWF land surface model is implemented and tested within the framework of the Global Soil Wetness Project II, to evaluate the feasibility of modelling global river runoff at a daily time scale. The exercise is designed to provide the benchmark river runoff predictions needed to verify the land surface model. Ten years of daily runoff produced by the HTESSEL land surface scheme are input into the TRIP2 river routing scheme to generate daily river runoff, which is then compared to river runoff observations from the Global Runoff Data Centre (GRDC) to evaluate its potential and limitations. A notable source of inaccuracy is the bias between observed and modelled discharges, which is not primarily due to the modelling system but rather to the forcing and the quality of the observations, and appears uncorrelated with river catchment size. A global sensitivity analysis and a Generalised Likelihood Uncertainty Estimation (GLUE) uncertainty analysis are applied to the global routing model. The groundwater delay parameter is identified as the most sensitive calibration parameter. Significant uncertainties are found in the results, and those due to the parameterisation of the routing model are quantified. The difficulty of parameterising global river discharge models is discussed. Detailed river runoff simulations are shown for the river Danube, which match observed river runoff well at upstream river transects. Results show that, although there are errors in the runoff predictions, they are encouraging and certainly indicative of useful runoff predictions, particularly for the purpose of verifying the land surface scheme hydrologically. The potential of this modelling system for future applications such as river runoff forecasting and climate impact studies is highlighted. Copyright © 2009 Royal Meteorological Society.
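The GLUE procedure applied above follows a generic pattern: sample parameter sets Monte-Carlo-style from prior ranges, score each simulation against observations with a likelihood measure, retain the "behavioural" sets above a threshold, and weight them by likelihood to bound predictions. The sketch below is a minimal illustration of that pattern, not the paper's implementation; the `simulate` callable, bounds, and the Nash-Sutcliffe likelihood choice are assumptions.

```python
import numpy as np

def glue(simulate, obs, param_bounds, n_samples=1000,
         threshold=0.5, rng=None):
    """Minimal GLUE sketch: Monte Carlo parameter sampling, informal
    likelihood (Nash-Sutcliffe efficiency, NSE), behavioural threshold,
    and likelihood-weighted retained parameter sets.
    """
    rng = np.random.default_rng() if rng is None else rng
    lows, highs = map(np.asarray, zip(*param_bounds))
    behavioural, weights = [], []
    for _ in range(n_samples):
        theta = rng.uniform(lows, highs)       # draw a parameter set
        sim = simulate(theta)                  # run the model
        nse = 1.0 - np.sum((obs - sim) ** 2) / np.sum(
            (obs - obs.mean()) ** 2)
        if nse > threshold:                    # keep behavioural sets
            behavioural.append(theta)
            weights.append(nse)
    w = np.asarray(weights)
    return np.asarray(behavioural), w / w.sum()
```

In a routing application, `theta` would contain parameters such as the groundwater delay, and the weighted behavioural runs would give the quantified uncertainty bounds.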
Abstract:
Smart grid research has tended to be compartmentalised, with notable contributions from economics, electrical engineering and science and technology studies. However, there is an acknowledged and growing need for an integrated systems approach to the evaluation of smart grid initiatives. The capacity to simulate and explore smart grid possibilities on various scales is key to such an integrated approach but existing models – even if multidisciplinary – tend to have a limited focus. This paper describes an innovative and flexible framework that has been developed to facilitate the simulation of various smart grid scenarios and the interconnected social, technical and economic networks from a complex systems perspective. The architecture is described and related to realised examples of its use, both to model the electricity system as it is today and to model futures that have been envisioned in the literature. Potential future applications of the framework are explored, along with its utility as an analytic and decision support tool for smart grid stakeholders.
Abstract:
The self-assembly in aqueous solution of three novel telechelic conjugates comprising a central hydrophilic polymer and short (trimeric or pentameric) tyrosine end-caps has been investigated. Two of the conjugates have a central poly(oxyethylene) (polyethylene oxide, PEO) block with different molar masses. The other conjugate has a central poly(l-alanine) (PAla) sequence in a purely amino-acid-based conjugate. All three conjugates self-assemble into β-sheet-based fibrillar structures, although the fibrillar morphology revealed by cryogenic TEM is distinct for the three polymers: in particular, Tyr5-PEO6k-Tyr5 forms a population of short straight fibrils, in contrast to the more diffuse fibril aggregates observed for Tyr5-PEO2k-Tyr5 and Tyr3-PAla-Tyr3. Hydrogel formation was not observed for these samples (in contrast to prior work on related systems) up to quite high concentrations, showing that it is possible to prepare solutions of peptide-polymer-peptide conjugates with hydrophobic end-caps without the conformational constraints associated with hydrogelation. Tyr5-PEO6k-Tyr5 shows significant PEO crystallization upon drying, in contrast to the Tyr5-PEO2k-Tyr5 conjugate. Our findings point to the remarkable ability of short hydrophobic peptide end groups to modulate the self-assembly properties of polymers in solution in model peptide-capped "associative polymers". Retention of fluidity at high conjugate concentration may be valuable in potential future applications of these conjugates as bioresponsive or biocompatible materials, for example exploiting the enzyme-responsiveness of the tyrosine end-groups.
Abstract:
Mainframes and corporate and central servers are becoming information servers. The requirement for more powerful information servers is the best opportunity to exploit the potential of parallelism. ICL recognized the opportunity of the 'knowledge spectrum', namely to convert raw data into information and then into high-grade knowledge. Its response to this, and to the underlying search problems, was to introduce the CAFS retrieval engine. The CAFS product demonstrates that it is possible to move functionality within an established architecture, introduce a different technology mix and exploit parallelism to achieve radically new levels of performance. CAFS also demonstrates the benefit of achieving this transparently behind existing interfaces. ICL is now working with Bull and Siemens to develop the information servers of the future by exploiting new technologies as they become available. The objective of the joint Esprit II European Declarative System project is to develop a smoothly scalable, highly parallel computer system, EDS. EDS will in the main be an SQL server and an information server. It will support the many data-intensive applications which the companies foresee; it will also support application-intensive and logic-intensive systems.
Abstract:
The MarQUEST (Marine Biogeochemistry and Ecosystem Modelling Initiative in QUEST) project was established to develop improved descriptions of marine biogeochemistry, suited to the next generation of Earth system models. We review progress in these areas, providing insight into the advances that have been made and identifying the key outstanding gaps in the development of the marine component of next-generation Earth system models. The following issues are discussed and, where appropriate, results are presented: the choice of model structure; scaling processes from physiology to functional types; the sensitivity of ecosystem models to changes in the physical environment; the role of the coastal ocean; and new methods for the evaluation and comparison of ecosystem and biogeochemistry models. We make recommendations as to where future investment in marine ecosystem modelling should be focused, highlighting a generic software framework for model development, improved hydrodynamic models, better parameterisation of new and existing models, reanalysis tools and ensemble simulations. The final challenge is to ensure that experimental and observational scientists are stakeholders in the models, and vice versa.
Abstract:
In the Biodiversity World (BDW) project we have created a flexible and extensible Web Services-based Grid environment for biodiversity researchers to solve problems in biodiversity and analyse biodiversity patterns. In this environment, heterogeneous and globally distributed biodiversity-related resources, such as data sets and analytical tools, are made available to be accessed and assembled by users into workflows to perform complex scientific experiments. One such experiment is bioclimatic modelling of the geographical distribution of individual species using climate variables, in order to predict past and future climate-related changes in species distribution. Data sources and analytical tools required for such analysis of species distribution are widely dispersed, available on heterogeneous platforms, present data in different formats and lack interoperability. The BDW system brings all these disparate units together so that the user can combine tools with little thought as to their availability, data formats and interoperability. The current Web Services-based Grid environment enables execution of the BDW workflow tasks in remote nodes, but with a limited scope. The next step in the evolution of the BDW architecture is to enable workflow tasks to utilise computational resources available within and outside the BDW domain. We describe the present BDW architecture and its transition to a new framework which provides a distributed computational environment for mapping and executing workflows, in addition to bringing together heterogeneous resources and analytical tools.
Abstract:
MS is an important analytical tool in clinical proteomics, primarily in the disease-specific discovery, identification and characterisation of proteomic biomarkers and patterns. MS-based proteomics is increasingly used in clinical validation and diagnostic method development. The latter departs from the typical application of MS-based proteomics by exchanging some of the high performance of analysis for the throughput, robustness and simplicity required for clinical diagnostics. Although conventional MS-based proteomics has become an important field in clinical applications, some of the most recent MS technologies have not yet been extensively applied in clinical proteomics. In this review, we describe the current state of MS in clinical proteomics and look to the future of this field.
Abstract:
Mathematical models have been vitally important in the development of technologies in building engineering. A literature review identifies that linear models are the most widely used building simulation models. The advent of intelligent buildings has added new challenges to the application of the existing models, as an intelligent building requires learning and self-adjusting capabilities based on environmental and occupants' factors. It is therefore argued that linearity is an inappropriate basis for any model of complex building systems or occupant behaviours, whether for control or any other purpose. Chaos and complexity theory reflects the nonlinear dynamic properties of intelligent systems exercised by occupants and the environment, and has been used widely in modelling various engineering, natural and social systems. It is proposed that chaos and complexity theory be applied to the study of intelligent buildings. This paper gives a brief description of chaos and complexity theory, presents its current position and recent developments in building engineering research, and discusses its future potential applications to intelligent building studies, providing a bridge between chaos and complexity theory and intelligent building research.
Abstract:
This article is the second part of a review of the historical evolution of mathematical models applied in the development of building technology. The first part described the current state of the art and contrasted various models with regard to their application to conventional and intelligent buildings. It concluded that the mathematical techniques adopted in neural networks, expert systems, fuzzy logic and genetic models, which can be used to address model uncertainty, are well suited to modelling intelligent buildings. Despite this progress, the possible future development of intelligent buildings based on current trends implies some potential limitations of these models. This paper attempts to uncover the fundamental limitations inherent in these models and provides some insights into future modelling directions, with special focus on the techniques of semiotics and chaos. Finally, by demonstrating an example of an intelligent building system together with the mathematical models that have been developed for such a system, this review addresses the influence of mathematical models as a potential aid in developing intelligent buildings, and perhaps even more advanced buildings, for the future.
Abstract:
Is the human body a suitable place for a microchip? Such discussion is no longer hypothetical; indeed, it has not been for some years. Restorative devices such as pacemakers and cochlear implants have become well established, and these sophisticated devices form notably intimate links between technology and the body. More recent developments in engineering technologies mean that the integration of silicon with biology is now reaching new levels, with devices that interact directly with the brain. As medical technologies continue to advance, their potential benefits for human enhancement will become increasingly attractive, so we need to consider seriously where this may take us. In this paper, an attempt is made to demonstrate that, in the medical context, the foundations of more advanced implantable enhancement technologies are already notably progressed, and that they are becoming more science fact than is widely appreciated. A number of wider moral, ethical and legal issues stem from enhancement applications, and it is difficult to foresee the social consequences, the fundamental changes to our very conception of self, and the impact of long-term adoption on our identity. As a result, it is necessary to acknowledge these possibilities, and it is timely to debate the wider implications they may bring.
Abstract:
Space applications demand reliable systems. Autonomic computing defines such reliable systems as self-managing systems. The work reported in this paper combines agent-based and swarm robotic approaches, leading to swarm-array computing, a novel technique for achieving self-managing distributed parallel computing systems. Two swarm-array computing approaches, based on swarms of computational resources and swarms of tasks, are explored. An FPGA is considered as the computing system. The feasibility of the two proposed approaches, which bind the computing system and the task together, is simulated on the SeSAm multi-agent simulator.