946 results for Towards Seamless Integration of Geoscience Models and Data
Phosphorus dynamics and export in streams draining micro-catchments: Development of empirical models
Abstract:
Annual total phosphorus (TP) export data from 108 European micro-catchments were analyzed against descriptive catchment data on climate (runoff), soil types, catchment size, and land use. The best empirical model developed included runoff, proportion of agricultural land and catchment size as explanatory variables, but explained only a modest share of the variance in the dataset (R² = 0.37). Improved country-specific empirical models could be developed in some cases. The best example was from Norway, where an analysis of TP-export data from 12 predominantly agricultural micro-catchments revealed a relationship explaining 96% of the variance in TP-export. The explanatory variables were in this case soil-P status (P-AL), proportion of organic soil, and the export of suspended sediment. Another example is from Denmark, where an empirical model was established for the basic annual average TP-export from 24 catchments with percentage sandy soils, percentage organic soils, runoff, and application of phosphorus in fertilizer and animal manure as explanatory variables (R² = 0.97).
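A regression of the kind described above (TP export against runoff, proportion of agricultural land and catchment size) can be sketched as an ordinary least-squares fit. The data below are synthetic and the generating coefficients are illustrative assumptions, not values from the study:

```python
import numpy as np

# Synthetic predictors for 50 hypothetical micro-catchments (NOT the study's data):
# runoff (mm/yr), proportion of agricultural land (0-1), catchment area (km^2)
rng = np.random.default_rng(0)
n = 50
runoff = rng.uniform(200, 1200, n)
agri = rng.uniform(0.0, 1.0, n)
area = rng.uniform(0.1, 10.0, n)
# Synthetic TP export generated from an assumed linear relation plus noise
tp_export = 0.002 * runoff + 0.5 * agri - 0.02 * area + rng.normal(0.0, 0.2, n)

# Ordinary least-squares fit of TP export on the three explanatory variables
X = np.column_stack([np.ones(n), runoff, agri, area])
coef, *_ = np.linalg.lstsq(X, tp_export, rcond=None)

# Coefficient of determination (the European dataset above gave R^2 = 0.37)
pred = X @ coef
r2 = 1.0 - np.sum((tp_export - pred) ** 2) / np.sum((tp_export - tp_export.mean()) ** 2)
print(f"R^2 = {r2:.2f}")
```

A country-specific model, as for Norway or Denmark above, would simply swap in different explanatory columns.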
Abstract:
A comparison of the models of Vitti et al. (2000, J. Anim. Sci. 78, 2706-2712) and Fernández (1995c, Livest. Prod. Sci. 41, 255-261) was carried out using two data sets on growing pigs as input. The two models compared were based on similar basic principles, although their aims and calculations differed. The Vitti model employs the rate:state formalism and describes phosphorus (P) flow between four pools representing P content in gut, blood, bone and soft tissue in growing goats. The Fernández model describes flow and fractional recirculation between P pools in gut, blood and bone in growing pigs. The results from both models showed similar trends for P absorption from gut to blood and net retention in bone with increasing P intake, with the exception of the 65 kg results from Data Set 2 calculated using the Fernández model. Endogenous loss from blood back to gut increased faster with increasing P intake in the Fernández model than in the Vitti model for Data Set 1. However, for Data Set 2, endogenous loss increased with increasing P intake using the Vitti model, but decreased when calculated using the Fernández model. Incorporation of P into bone was not influenced by intake in the Fernández model, while in the Vitti model there was an increasing trend. The Fernández model produced a pattern of decreasing resorption in bone with increasing P intake with one of the data sets, which was not observed when using the Vitti model. The pigs maintained their P homeostasis in blood by regulation of P excretion in urine. (c) 2005 Elsevier Ltd. All rights reserved.
Abstract:
1. Demographic models are assuming an important role in management decisions for endangered species. Elasticity analysis and scope for management analysis are two such applications. Elasticity analysis determines the vital rates that have the greatest impact on population growth. Scope for management analysis examines the effects that feasible management might have on vital rates and population growth. Both methods target management in an attempt to maximize population growth. 2. The Seychelles magpie robin Copsychus sechellarum is a critically endangered island endemic, the population of which underwent significant growth in the early 1990s following the implementation of a recovery programme. We examined how the formal use of elasticity and scope for management analyses might have shaped management in the recovery programme, and assessed their effectiveness by comparison with the actual population growth achieved. 3. The magpie robin population doubled from about 25 birds in 1990 to more than 50 by 1995. A simple two-stage demographic model showed that this growth was driven primarily by a significant increase in the annual survival probability of first-year birds and an increase in the birth rate. Neither the annual survival probability of adults nor the probability of a female breeding at age 1 changed significantly over time. 4. Elasticity analysis showed that the annual survival probability of adults had the greatest impact on population growth. There was some scope to use management to increase survival, but because survival rates were already high (> 0.9) this had a negligible effect on population growth. Scope for management analysis showed that significant population growth could have been achieved by targeting management measures at the birth rate and survival probability of first-year birds, although predicted growth rates were lower than those achieved by the recovery programme when all management measures were in place (i.e. 1992-95). 5. Synthesis and applications. We argue that scope for management analysis can provide a useful basis for management but will inevitably be limited to some extent by a lack of data, as our study shows. This means that identifying perceived ecological problems and designing management to alleviate them must be an important component of endangered species management. The corollary of this is that it will not be possible or wise to consider only management options for which there is a demonstrable ecological benefit. Given these constraints, we see little role for elasticity analysis because, when data are available, a scope for management analysis will always be of greater practical value and, when data are lacking, precautionary management demands that as many perceived ecological problems as possible are tackled.
Abstract:
The aim of this review paper is to present experimental methodologies and the mathematical approaches used to determine effective diffusivities of solutes in food materials. The paper commences by describing the diffusion phenomena related to solute mass transfer in foods and effective diffusivities. It then focuses on the mathematical formulation for the calculation of effective diffusivities considering different diffusion models based on Fick's second law of diffusion. Finally, experimental considerations for effective diffusivity determination are elucidated, primarily based on the acquisition of a series of solute content versus time curves appropriate to the chosen model equation. Different factors contributing to the determination of the effective diffusivities, such as the structure of the food material, temperature, diffusion solvent, agitation, sampling, concentration and the different techniques used, are considered. (c) 2005 Elsevier Inc. All rights reserved.
Abstract:
Essential oils have been widely used in traditional medicine for the eradication of lice, including head lice, but due to the variability of their constitution the effects may not be reproducible. In an attempt to assess the contribution of their component monoterpenoids, a range of common individual compounds were tested in an in vitro toxicity model against both human lice (Pediculus humanus, an accepted model of head lice lethality) and their eggs, at different concentrations. No detailed study into the relative potencies of their constituent terpenoids has so far been published. Adult lice were observed for lack of response to stimuli over 3 h and the LT50 calculated, and the percentage of eggs failing to hatch was used to generate ovicidal activity data. A ranking was compiled for adult lice and partially for eggs, enabling structure-activity relationships to be assessed for lethality to both, and showed that different structural criteria were required for activity in the two life-cycle stages. (+)-Terpinen-4-ol was the most effective compound against adult lice, followed by other mono-oxygenated monocyclic compounds, whereas nerolidol was particularly lethal to eggs, but ineffective against adult lice. (c) 2006 Elsevier B.V. All rights reserved.
Abstract:
In this work a method for building multiple-model structures is presented. A clustering algorithm that uses data from the system is employed to define the architecture of the multiple-model structure, including the size of the region covered by each model and the number of models. A heating, ventilation and air conditioning (HVAC) system is used as a testbed for the proposed method.
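One plausible reading of such a scheme is sketched below, assuming k-means clustering over operating data and one local linear model per cluster; the paper's actual clustering algorithm may differ:

```python
import numpy as np

def fit_multiple_model(X, y, n_models=3, n_iter=20, seed=0):
    """Partition the operating region with plain k-means, then fit one
    local linear model per cluster: the clustering fixes both the number
    of models and the region each model covers."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), n_models, replace=False)].astype(float)
    for _ in range(n_iter):
        # assign each sample to its nearest centre, then update centres
        labels = np.argmin(((X[:, None, :] - centers) ** 2).sum(-1), axis=1)
        for k in range(n_models):
            if np.any(labels == k):
                centers[k] = X[labels == k].mean(axis=0)
    models = []
    for k in range(n_models):
        mask = labels == k
        if not mask.any():                      # empty region: trivial model
            models.append(np.zeros(X.shape[1] + 1))
            continue
        Xk = np.column_stack([np.ones(mask.sum()), X[mask]])
        coef, *_ = np.linalg.lstsq(Xk, y[mask], rcond=None)
        models.append(coef)
    return centers, models

def predict(x, centers, models):
    """Select the local model whose region (nearest centre) contains x."""
    k = int(np.argmin(((centers - x) ** 2).sum(-1)))
    return models[k] @ np.concatenate([[1.0], x])
```

The number of clusters plays the role of the number of local models, and the Voronoi cell of each centre is the region that model covers.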
Abstract:
Analyzes the use of linear and neural network models for financial distress classification, with emphasis on the issues of input variable selection and model pruning. A data-driven method for selecting input variables (financial ratios, in this case) is proposed. A case study involving 60 British firms in the period 1997-2000 is used for illustration. It is shown that the use of the Optimal Brain Damage pruning technique can considerably improve the generalization ability of a neural model. Moreover, the set of financial ratios obtained with the proposed selection procedure is shown to be an appropriate alternative to the ratios usually employed by practitioners.
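Optimal Brain Damage ranks each weight by a saliency computed from the diagonal of the loss Hessian, s_i = 0.5 * H_ii * w_i^2, and removes the least salient weights. A minimal sketch, assuming the Hessian diagonal is supplied and using hypothetical names:

```python
import numpy as np

def obd_prune(weights, hessian_diag, frac=0.2):
    """Optimal Brain Damage sketch: the saliency of weight i is
    s_i = 0.5 * H_ii * w_i**2 (a second-order estimate of the loss
    increase from deleting that weight); the lowest-saliency fraction
    of weights is set to zero."""
    saliency = 0.5 * hessian_diag * weights ** 2
    n_prune = int(len(weights) * frac)
    prune_idx = np.argsort(saliency)[:n_prune]
    pruned = weights.copy()
    pruned[prune_idx] = 0.0
    return pruned
```

In practice the network is retrained after each pruning pass and the procedure repeated until generalization stops improving, which is how pruning can improve the generalization ability of a neural model.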
Abstract:
The performance of flood inundation models is often assessed using satellite observed data; however, these data have inherent uncertainty. In this study we assess the impact of this uncertainty when calibrating a flood inundation model (LISFLOOD-FP) for a flood event in December 2006 on the River Dee, North Wales, UK. The flood extent is delineated from an ERS-2 SAR image of the event using an active contour model (snake), and water levels at the flood margin calculated through intersection of the shoreline vector with LiDAR topographic data. Gauged water levels are used to create a reference water surface slope for comparison with the satellite-derived water levels. Residuals between the satellite observed data points and those from the reference line are spatially clustered into groups of similar values. We show that model calibration achieved using pattern matching of observed and predicted flood extent is negatively influenced by this spatial dependency in the data. By contrast, model calibration using water elevations produces realistic calibrated optimum friction parameters even when spatial dependency is present. To test the impact of removing spatial dependency, a new method of evaluating flood inundation model performance is developed by using multiple random subsamples of the water surface elevation data points. By testing for spatial dependency using Moran’s I, multiple subsamples of water elevations that have no significant spatial dependency are selected. The model is then calibrated against these data and the results averaged. This gives a near-identical result to calibration using spatially dependent data, but has the advantage of being a statistically robust assessment of model performance in which we can have more confidence. Moreover, by using the variations found in the subsamples of the observed data it is possible to assess the effects of observational uncertainty on the assessment of flooding risk.
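The subsampling procedure described above (draw random subsamples of water-elevation points, test each for spatial dependency with Moran's I, keep only the independent ones) can be sketched as follows; the inverse-distance weighting and the acceptance threshold are illustrative assumptions, not the study's exact test:

```python
import numpy as np

def morans_i(values, coords):
    """Moran's I with inverse-distance spatial weights; values close to
    the expectation E[I] = -1/(n-1) indicate no spatial autocorrelation."""
    n = len(values)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    w = np.where(d > 0.0, 1.0 / np.maximum(d, 1e-12), 0.0)  # zero self-weights
    z = values - values.mean()
    return (n / w.sum()) * (z @ w @ z) / (z @ z)

def independent_subsamples(values, coords, n_sub=30, size=15, thresh=0.1, seed=0):
    """Draw random subsamples of the observed points and keep those whose
    Moran's I sits near its expectation, i.e. with no marked spatial
    dependency (a crude stand-in for a full significance test)."""
    rng = np.random.default_rng(seed)
    expected = -1.0 / (size - 1)
    kept = []
    for _ in range(n_sub):
        idx = rng.choice(len(values), size, replace=False)
        if abs(morans_i(values[idx], coords[idx]) - expected) < thresh:
            kept.append(idx)
    return kept
```

The model would then be calibrated against each retained subsample and the results averaged, as in the study.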
Abstract:
Leisure is in the vanguard of a social and cultural revolution which is replacing the former East/West political bipolarity with a globalised economic system in which the new Europe has a central rôle. Within this revolution, leisure, including recreation, culture and tourism, is constructed as the epitome of successful capitalist development; the very legitimisation of the global transmogrification from a production to a consumption orientation. While acting as a direct encouragement to the political transformation in many eastern European states, it is uncertain how the issue of leisure policy is being handled, given its centrality to the new economic order. This paper therefore examines the experience of western Europe, considering in particular the degree to which the newly-created Department of National Heritage in the UK provides a potential model for leisure development and policy integration in the new Europe. Despite an official rhetoric of support and promotion of leisure activities, reflecting the growing economic significance of tourism and the positive relationship between leisure provision and regional economic development, the paper establishes that in place of the traditional rôle of the state in promoting leisure interests, the introduction of the Department has signified a shift to the use of leisure to promote the Government's interests, particularly in redirecting citizen rights claims towards the market. While an institution such as the Department of National Heritage may have relevance to emerging states as an element in the maintenance of political hegemony, therefore, it is questionable how far it can be viewed as a promoter or protector of leisure as a signifier of a newly-won political, economic and cultural freedom throughout Europe.
Abstract:
Current variability of precipitation (P) and its response to surface temperature (T) are analysed using coupled (CMIP5) and atmosphere-only (AMIP5) climate model simulations and compared with observational estimates. There is striking agreement between Global Precipitation Climatology Project (GPCP) observed and AMIP5 simulated P anomalies over land both globally and in the tropics, suggesting that prescribed sea surface temperature and realistic radiative forcings are sufficient for simulating the interannual variability in continental P. Differences between the observed and simulated P variability over the ocean originate primarily from the wet tropical regions, in particular the western Pacific, but are reduced slightly after 1995. All datasets show positive responses of P to T globally of around 2%/K for simulations and 3-4%/K in GPCP observations, but model responses over the tropical oceans are around 3 times smaller than GPCP over the period 1988-2005. The observed anticorrelation between land and ocean P, linked with the El Niño Southern Oscillation, is captured by the simulations. All datasets over the tropical ocean show a tendency for wet regions to become wetter and dry regions drier with warming. Over the wet region (75% precipitation percentile), the precipitation response is ~13-15%/K for GPCP and ~5%/K for models, while trends in P are 2.4%/decade for GPCP, 0.6%/decade for CMIP5 and 0.9%/decade for AMIP5, suggesting that models are underestimating the precipitation responses or a deficiency exists in the satellite datasets.
Abstract:
Climate modeling is a complex process, requiring accurate and complete metadata in order to identify, assess and use climate data stored in digital repositories. The preservation of such data is increasingly important given the development of increasingly complex models to predict the effects of global climate change. The EU METAFOR project has developed a Common Information Model (CIM) to describe climate data and the models and modelling environments that produce this data. There is a wide degree of variability between different climate models and modelling groups. To accommodate this, the CIM has been designed to be highly generic and flexible, with extensibility built in. METAFOR describes the climate modelling process simply as "an activity undertaken using software on computers to produce data." This process has been described as separate UML packages (and, ultimately, XML schemas). This fairly generic structure can be paired with more specific "controlled vocabularies" in order to restrict the range of valid CIM instances. The CIM will aid digital preservation of climate models as it will provide an accepted standard structure for the model metadata. Tools to write and manage CIM instances, and to allow convenient and powerful searches of CIM databases, are also under development. Community buy-in of the CIM has been achieved through a continual process of consultation with the climate modelling community, and through the METAFOR team’s development of a questionnaire that will be used to collect the metadata for the Intergovernmental Panel on Climate Change’s (IPCC) Coupled Model Intercomparison Project Phase 5 (CMIP5) model runs.
Abstract:
1. Nutrient concentrations (particularly N and P) determine the extent to which water bodies are or may become eutrophic. Direct determination of nutrient content on a wide scale is labour intensive but the main sources of N and P are well known. This paper describes and tests an export coefficient model for prediction of total N and total P from: (i) land use, stock headage and human population; (ii) the export rates of N and P from these sources; and (iii) the river discharge. Such a model might be used to forecast the effects of changes in land use in the future and to hindcast past water quality to establish comparative or baseline states for the monitoring of change. 2. The model has been calibrated against observed data for 1988 and validated against sets of observed data for a sequence of earlier years in ten British catchments varying from uplands through rolling, fertile lowlands to the flat topography of East Anglia. 3. The model predicted total N and total P concentrations with high precision (95% of the variance in observed data explained). It has been used in two forms: the first on a specific catchment basis; the second for a larger natural region which contains the catchment, with the assumption that all catchments within that region will be similar. Both models gave similar results with little loss of precision in the latter case. This implies that it will be possible to describe the overall pattern of nutrient export in the UK with only a fraction of the effort needed to carry out the calculations for each individual water body. 4. Comparison between land use, stock headage, population numbers and nutrient export for the ten catchments in the pre-war year of 1931, and for 1970 and 1988, shows that there has been a substantial loss of rough grazing to fertilized temporary and permanent grasslands, an increase in the hectarage devoted to arable, consistent increases in the stocking of cattle and sheep and a marked movement of humans to these rural catchments. 5. All of these trends have increased the flows of nutrients, with more than a doubling of both total N and total P loads during the period. On average in these rural catchments, stock wastes have been the greatest contributors to both N and P exports, with cultivation the next most important source of N and people of P. Ratios of N to P were high in 1931 and remain little changed so that, in these catchments, phosphorus continues to be the nutrient most likely to control algal crops in standing waters supplied by the rivers studied.
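The export coefficient calculation described above (total nutrient load as the sum of per-source exports, scaled by river discharge to give a concentration) can be sketched as follows; every coefficient and inventory value here is a hypothetical placeholder, not a calibrated figure from the study:

```python
# Export coefficient model sketch: the total phosphorus load is the sum of
# source quantities (areas, stock headage, population) times their per-unit
# export coefficients; dividing by annual discharge gives a mean concentration.
# All numbers below are illustrative assumptions, not calibrated study values.

export_coeff_p = {        # kg P per unit per year (hypothetical)
    "arable_ha": 0.65,
    "grassland_ha": 0.30,
    "rough_grazing_ha": 0.02,
    "cattle_head": 0.20,
    "sheep_head": 0.05,
    "person": 0.40,
}

catchment = {             # hypothetical catchment inventory
    "arable_ha": 4000,
    "grassland_ha": 2500,
    "rough_grazing_ha": 1000,
    "cattle_head": 3000,
    "sheep_head": 8000,
    "person": 1500,
}

load_kg = sum(export_coeff_p[k] * catchment[k] for k in catchment)  # kg P / yr
discharge_m3 = 3.0e7      # annual river discharge, m^3 (hypothetical)
conc_mg_l = load_kg * 1e6 / (discharge_m3 * 1e3)  # kg -> mg, m^3 -> litres

print(f"TP load: {load_kg:.0f} kg/yr, mean TP: {conc_mg_l:.3f} mg/l")
```

Hindcasting past water quality then amounts to rerunning the same sum with the 1931 or 1970 inventory, and the regional form of the model applies one set of coefficients to every catchment in the region.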
Abstract:
Platelets in the circulation are triggered by vascular damage to activate, aggregate and form a thrombus that prevents excessive blood loss. Platelet activation is stringently regulated by intracellular signalling cascades, which when activated inappropriately lead to myocardial infarction and stroke. Strategies to address platelet dysfunction have included proteomics approaches, which have led to the discovery of a number of novel regulatory proteins of potential therapeutic value. Global analysis of platelet proteomes may enhance the outcome of these studies by arranging this information in a contextual manner that recapitulates established signalling complexes and predicts novel regulatory processes. Platelet signalling networks have already begun to be exploited with interrogation of protein datasets using in silico methodologies that locate functionally feasible protein clusters for subsequent biochemical validation. Characterization of these biological systems through analysis of spatial and temporal organization of component proteins is developing alongside advances in the proteomics field. This focused review highlights advances in platelet proteomics data mining approaches that complement the emerging systems biology field. We have also highlighted nucleated cell types as key examples that can inform platelet research. Therapeutic translation of these modern approaches to understanding platelet regulatory mechanisms will enable the development of novel anti-thrombotic strategies.
Abstract:
Glycogen synthase kinase 3 (GSK3, of which there are two isoforms, GSK3α and GSK3β) was originally characterized in the context of regulation of glycogen metabolism, though it is now known to regulate many other cellular processes. Phosphorylation of GSK3α(Ser21) and GSK3β(Ser9) inhibits their activity. In the heart, emphasis has been placed particularly on GSK3β, rather than GSK3α. Importantly, catalytically-active GSK3 generally restrains gene expression and, in the heart, catalytically-active GSK3 has been implicated in anti-hypertrophic signalling. Inhibition of GSK3 results in changes in the activities of transcription and translation factors in the heart and promotes hypertrophic responses, and it is generally assumed that signal transduction from hypertrophic stimuli to GSK3 passes primarily through protein kinase B/Akt (PKB/Akt). However, recent data suggest that the situation is far more complex. We review evidence pertaining to the role of GSK3 in the myocardium and discuss effects of genetic manipulation of GSK3 activity in vivo. We also discuss the signalling pathways potentially regulating GSK3 activity and propose that, depending on the stimulus, phosphorylation of GSK3 is independent of PKB/Akt. Potential GSK3 substrates studied in relation to myocardial hypertrophy include nuclear factors of activated T cells, β-catenin, GATA4, myocardin, CREB, and eukaryotic initiation factor 2Bε. These and other transcription factor substrates putatively important in the heart are considered. We discuss whether cardiac pathologies could be treated by therapeutic intervention at the GSK3 level but conclude that any intervention would be premature without greater understanding of the precise role of GSK3 in cardiac processes.
Abstract:
The goal of the Chemistry-Climate Model Validation (CCMVal) activity is to improve understanding of chemistry-climate models (CCMs) through process-oriented evaluation and to provide reliable projections of stratospheric ozone and its impact on climate. An appreciation of the details of model formulations is essential for understanding how models respond to the changing external forcings of greenhouse gases and ozone-depleting substances, and hence for understanding the ozone and climate forecasts produced by the models participating in this activity. Here we introduce and review the models used for the second round (CCMVal-2) of this intercomparison, regarding the implementation of chemical, transport, radiative, and dynamical processes in these models. In particular, we review the advantages and problems associated with approaches used to model processes of relevance to stratospheric dynamics and chemistry. Furthermore, we state the definitions of the reference simulations performed, and describe the forcing data used in these simulations. We identify some developments in chemistry-climate modeling that make models more physically based or more comprehensive, including the introduction of an interactive ocean, online photolysis, troposphere-stratosphere chemistry, and non-orographic gravity-wave deposition as linked to tropospheric convection. These relatively new developments indicate that stratospheric CCM modeling is becoming more consistent with our physically based understanding of the atmosphere.