856 results for Exclusion process, Multi-species, Multi-scale modelling
Abstract:
Insect pollination benefits over three quarters of the world's major crops. There is growing concern that observed declines in pollinators may affect production and revenues from animal-pollinated crops. Knowing the distribution of pollinators is therefore crucial for estimating their availability to pollinate crops; in general, however, we have an incomplete knowledge of where these pollinators occur. We propose a method to predict geographical patterns of pollination service to crops that is novel in two respects: the use of pollinator records, rather than expert knowledge, to predict pollinator occurrence, and the inclusion of the managed pollinator supply. We integrated a maximum entropy species distribution model (SDM) with an existing pollination service model (PSM) to derive the availability of pollinators for crop pollination. We used nationwide records of wild and managed pollinators (honey bees) as well as agricultural data from Great Britain. We first calibrated the SDM on a representative sample of bee and hoverfly crop pollinator species, evaluating the effects of different settings on model performance and on its capacity to identify the most important predictors. The importance of the different predictors was better resolved by SDMs derived from simpler functions, with consistent results for bees and hoverflies. We then used the species distributions from the calibrated model to predict the pollination service provided by wild and managed pollinators, using field beans as a test case. The PSM allowed us to spatially characterize the contributions of wild and managed pollinators and to identify areas potentially vulnerable to low pollination service provision, which can help direct local-scale interventions. This approach can be extended to investigate geographical mismatches between crop pollination demand and the availability of pollinators resulting from environmental change or policy scenarios.
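As a rough illustration of the kind of integration described above, the sketch below (Python; all names, grid sizes and decay parameters are hypothetical, and it is not the authors' PSM) weights SDM-predicted occurrence probabilities by a distance-decay foraging kernel to map pollinator availability onto crop cells:

    # Minimal sketch: weighting SDM-predicted occurrence probabilities by a
    # distance-decay foraging kernel to map pollinator availability onto crop cells.
    # All names, grid sizes and decay parameters below are hypothetical.
    import numpy as np
    from scipy.signal import convolve2d

    def foraging_kernel(radius_cells, decay):
        """Exponential distance-decay kernel, truncated at radius_cells and normalised."""
        ax = np.arange(-radius_cells, radius_cells + 1)
        xx, yy = np.meshgrid(ax, ax)
        dist = np.hypot(xx, yy)
        k = np.exp(-dist / decay)
        k[dist > radius_cells] = 0.0
        return k / k.sum()

    def pollination_availability(occurrence_prob, crop_mask, radius_cells=10, decay=3.0):
        """Distance-weighted pollinator availability, reported only for crop cells."""
        k = foraging_kernel(radius_cells, decay)
        availability = convolve2d(occurrence_prob, k, mode="same", boundary="symm")
        return availability * crop_mask

    # Example with random stand-ins for an SDM output map and a field-bean mask
    rng = np.random.default_rng(0)
    occ = rng.random((100, 100))            # SDM occurrence probability per grid cell
    beans = rng.random((100, 100)) > 0.7    # cells growing the crop
    service_map = pollination_availability(occ, beans)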
Abstract:
This study represents the first detailed multi-proxy palaeoenvironmental investigation associated with a Late Iron Age lake-dwelling site in the eastern Baltic. The main objective was to reconstruct the environmental and vegetation dynamics associated with the establishment of the lake-dwelling and with land-use during the last 2,000 years. A lacustrine sediment core located adjacent to a Late Iron Age lake-dwelling, a medieval castle and a Post-medieval manor was sampled in Lake Āraiši. The core was dated using spheroidal fly-ash particles and radiocarbon dating, and analysed in terms of pollen, non-pollen palynomorphs, diatoms, loss-on-ignition, magnetic susceptibility and element geochemistry. Associations between pollen and the other proxies were statistically tested. During AD 1–700, the vicinity of Lake Āraiši was covered by forests and human activities were only small-scale, with the first appearance of cereal pollen (Triticum and Secale cereale) after AD 400. The most significant changes in vegetation and environment occurred with the establishment of the lake-dwelling around AD 780, when the immediate surroundings of the lake were cleared for agriculture and nutrient levels within the lake increased. The highest accumulation rates of coprophilous fungi coincide with the occupation of the lake-dwelling from AD 780 to 1050, indicating that parts of the dwelling functioned as byres for livestock. The conquest of tribal lands during the crusades resulted in changes to the ownership, administration and organisation of the land, but our results indicate that the form and type of agriculture and land-use continued much as they had during the preceding Late Iron Age.
Abstract:
The overall global-scale consequences of climate change depend on the distribution of impacts across regions, and there are multiple dimensions to these impacts. This paper presents a global assessment of the potential impacts of climate change across several sectors, using a harmonised set of impact models forced by the same climate and socio-economic scenarios. Indicators of impact cover the water resources, river and coastal flooding, agriculture, natural environment and built environment sectors. Impacts are assessed under four SRES socio-economic and emissions scenarios, and the effects of uncertainty in the projected pattern of climate change are incorporated by constructing climate scenarios from 21 global climate models. There is considerable uncertainty in projected regional impacts across the climate model scenarios, so coherent assessments of impacts across sectors and regions must be based on each model pattern separately; using ensemble means, for example, reduces variability between sectors and indicators. An example narrative assessment is presented in the paper. Under this narrative, approximately 1 billion people would be exposed to increased water resources stress, around 450 million people would be exposed to increased river flooding, and 1.3 million extra people would be flooded in coastal floods each year. Crop productivity would fall in most regions, and residential energy demands would be reduced in most regions because reduced heating demands would offset higher cooling demands. Most of the global impacts on water stress and flooding would be in Asia, but the proportional impacts in the Middle East and North Africa region would be larger. By 2050 there are emerging differences in impact between the different emissions and socio-economic scenarios even though the changes in temperature and sea level are similar, and these differences are greater by 2080. However, for all the indicators, the range in projected impacts between different climate models is considerably greater than the range between emissions and socio-economic scenarios.
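The point about ensemble means can be illustrated with a toy example (Python; synthetic data and a hypothetical threshold indicator, not one of the paper's sectoral impact models): applying a nonlinear impact indicator to the ensemble-mean climate pattern is not the same as averaging the indicator computed for each model pattern, and it tends to damp the variability.

    # Toy example of the averaging order. Synthetic data and a hypothetical
    # threshold indicator; this is not one of the paper's impact models.
    import numpy as np

    rng = np.random.default_rng(1)
    n_models, n_cells = 21, 1000
    # Synthetic change in annual precipitation (%) for each climate model and grid cell
    dprecip = rng.normal(loc=0.0, scale=15.0, size=(n_models, n_cells))

    def cells_under_stress(dp):
        """Toy nonlinear indicator: count cells where drying exceeds 10%."""
        return int(np.sum(dp < -10.0))

    per_model = [cells_under_stress(dprecip[m]) for m in range(n_models)]
    mean_of_impacts = np.mean(per_model)                       # assess each pattern, then average
    impact_of_mean = cells_under_stress(dprecip.mean(axis=0))  # average the patterns first

    print(mean_of_impacts, impact_of_mean)  # the ensemble-mean pattern damps the signal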
Abstract:
Multi-model ensembles are frequently used to assess understanding of the response of ozone and methane lifetime to changes in emissions of ozone precursors such as NOx, VOCs (volatile organic compounds) and CO. When these ozone changes are used to calculate radiative forcing (RF) (and climate metrics such as the global warming potential (GWP) and the global temperature-change potential (GTP)), there is a methodological choice, determined partly by the available computing resources, as to whether the mean ozone (and methane) concentration changes are input to the radiation code, or whether each model's ozone and methane changes are used as input, with the average RF computed from the individual model RFs. We use data from the Task Force on Hemispheric Transport of Air Pollution source–receptor global chemical transport model ensemble to assess the impact of this choice for emission changes in four regions (East Asia, Europe, North America and South Asia). We conclude that using the multi-model mean ozone and methane responses is accurate for calculating the mean RF, with differences of up to 0.6% for CO, 0.7% for VOCs and 2% for NOx. Differences of up to 60% for NOx, 7% for VOCs and 3% for CO are introduced into the 20 year GWP. The differences for the 20 year GTP are smaller than those for the GWP for NOx, and similar for the other species. However, estimates of the standard deviation calculated from the ensemble-mean input fields (where the standard deviation at each point on the model grid is added to or subtracted from the mean field) are almost always substantially larger for the RF, GWP and GTP metrics than the true standard deviation, and can be larger than the model range for the short-lived ozone RF and for the 20 and 100 year GWP and the 100 year GTP. The order of averaging has the most impact on the metrics for NOx, as the net values of these quantities are the residuals of sums of terms of opposing signs. For example, the standard deviation for the 20 year GWP is 2–3 times larger using the ensemble-mean fields than using the individual models to calculate the RF. This effect is largely due to the construction of the input ozone fields, which overestimates the true ensemble spread. Hence, while averages of multi-model fields are normally appropriate for calculating the mean RF, GWP and GTP, they are not a reliable method for calculating the uncertainty in these metrics, and in general overestimate it.
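In equation form, the methodological choice assessed here is between averaging the forcings computed from each model's ozone and methane fields and computing the forcing of the ensemble-mean fields (notation illustrative):

    \overline{\mathrm{RF}} = \frac{1}{N}\sum_{i=1}^{N} \mathrm{RF}(\Delta O_i,\, \Delta M_i)
    \qquad \text{versus} \qquad
    \mathrm{RF}\!\left(\frac{1}{N}\sum_{i=1}^{N}\Delta O_i,\; \frac{1}{N}\sum_{i=1}^{N}\Delta M_i\right)

The spread estimate criticised in the abstract perturbs the mean field by its grid-point standard deviation before the radiation calculation, roughly

    \sigma_{\mathrm{RF}}^{\mathrm{field}} \approx \tfrac{1}{2}\left[\mathrm{RF}\!\left(\overline{\Delta O} + \sigma_{\Delta O}\right) - \mathrm{RF}\!\left(\overline{\Delta O} - \sigma_{\Delta O}\right)\right]

which generally exceeds the standard deviation of the individual-model forcings.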
Abstract:
An improved understanding of present-day climate variability and change relies on high-quality data sets from the past 2 millennia. Global efforts to model regional climate modes are in the process of being validated against, and integrated with, records of past vegetation change. For South America, however, the full potential of vegetation records for evaluating and improving climate models has hitherto not been sufficiently acknowledged due to an absence of information on the spatial and temporal coverage of study sites. This paper therefore serves as a guide to high-quality pollen records that capture environmental variability during the last 2 millennia. We identify 60 vegetation (pollen) records from across South America which satisfy geochronological requirements set out for climate modelling, and we discuss their sensitivity to the spatial signature of climate modes throughout the continent. Diverse patterns of vegetation response to climate change are observed, with more similar patterns of change in the lowlands and varying intensity and direction of responses in the highlands. Pollen records display local-scale responses to climate modes; thus, it is necessary to understand how vegetation–climate interactions might diverge under variable settings. We provide a qualitative translation from pollen metrics to climate variables. Additionally, pollen is an excellent indicator of human impact through time. We discuss evidence for human land use in pollen records and provide an overview considered useful for archaeological hypothesis testing and important in distinguishing natural from anthropogenically driven vegetation change. We stress the need for the palynological community to be more familiar with climate variability patterns to correctly attribute the potential causes of observed vegetation dynamics. This manuscript forms part of the wider LOng-Term multi-proxy climate REconstructions and Dynamics in South America – 2k initiative that provides the ideal framework for the integration of the various palaeoclimatic subdisciplines and palaeo-science, thereby jump-starting and fostering multidisciplinary research into environmental change on centennial and millennial timescales.
Abstract:
Cover crops are sown to provide a number of ecosystem services, including nutrient management, mitigation of diffuse pollution, improved soil structure and organic matter content, weed suppression, nitrogen fixation and the provision of resources for biodiversity. Although the decision to sow a cover crop may be driven by a desire to achieve just one of these objectives, the diversity of cover crop species and mixtures available means that there is potential to combine a number of ecosystem services within the same crop and growing season. Designing multi-functional cover crops would potentially help to reconcile the often conflicting agronomic and environmental agendas and contribute to the optimal use of land. We present a framework for integrating the multiple ecosystem services delivered by cover crops that aims to design a mixture of species with complementary growth habits and functionality. The optimal number and identity of species will depend on the services included in the analysis, the functional space represented by the available species pool and the community dynamics of the crop in terms of dominance and co-existence. Experience from a project that applied the framework to fertility-building leys in organic systems demonstrated its potential and emphasised the importance of the initial choice of species to include in the analysis.
Abstract:
Floods are the most frequent of natural disasters, affecting millions of people across the globe every year. The anticipation and forecasting of floods at the global scale is crucial for preparing for severe events and providing early awareness where local flood models and warning services may not exist. As numerical weather prediction models continue to improve, operational centres are increasingly using their meteorological output to drive hydrological models, creating hydrometeorological systems capable of forecasting river flow and flood events at much longer lead times than was previously possible. Furthermore, developments in modelling capabilities, data and resources in recent years have made it possible to produce global-scale flood forecasting systems. In this paper, the current state of operational large-scale flood forecasting is discussed, including probabilistic forecasting of floods using ensemble prediction systems. Six state-of-the-art operational large-scale flood forecasting systems are reviewed, describing similarities and differences in their approaches to forecasting floods at the global and continental scale. Currently, operational systems have the capability to produce coarse-scale discharge forecasts in the medium range and to disseminate forecasts and, in some cases, early warning products in real time across the globe, in support of national forecasting capabilities. With improvements in seasonal weather forecasting, future advances may include more seamless hydrological forecasting at the global scale, alongside a move towards multi-model forecasts and grand ensemble techniques, responding to the requirement to develop multi-hazard early warning systems for disaster risk reduction.
Abstract:
Introducing a parameterization of the interactions between wind-driven snow depth changes and melt pond evolution allows us to improve large-scale models. In this paper we have implemented an explicit melt pond scheme and, for the first time, a wind-dependent snow redistribution model and new snow thermophysics into a coupled ocean–sea ice model. The comparison of long-term mean statistics of melt pond fractions against observations demonstrates realistic melt pond cover on average over Arctic sea ice, but a clear underestimation of pond coverage on the multi-year ice (MYI) of the western Arctic Ocean. The latter shortcoming originates from the concealing effect of persistent snow on forming ponds, which impedes their growth. Analyzing a second simulation with intensified snow drift enables the identification of two distinct modes of sensitivity in the melt pond formation process. First, the larger proportion of wind-transported snow that is lost in leads directly curtails the late spring snow volume on sea ice and facilitates the early development of melt ponds on MYI. In contrast, a combination of higher air temperatures and thinner snow prior to the onset of melting sometimes makes the snow cover switch to a regime where it melts entirely and rapidly. In the latter situation, seemingly more frequent on first-year ice (FYI), a smaller snow volume directly relates to a reduced melt pond cover. Nevertheless, changes in snow and water accumulation on seasonal sea ice are naturally limited, which lessens the impacts of wind-blown snow redistribution on FYI compared with those on MYI. At the basin scale, the overall increased melt pond cover results in decreased ice volume via the ice–albedo feedback in summer, which is experienced almost exclusively by MYI.
Abstract:
Precipitation is expected to respond differently to various drivers of anthropogenic climate change. We present the first results from the Precipitation Driver and Response Model Intercomparison Project (PDRMIP), where nine global climate models have perturbed CO2, CH4, black carbon, sulfate, and solar insolation. We divide the resulting changes to global mean and regional precipitation into fast responses that scale with changes in atmospheric absorption and slow responses scaling with surface temperature change. While the overall features are broadly similar between models, we find significant regional intermodel variability, especially over land. Black carbon stands out as a component that may cause significant model diversity in predicted precipitation change. Processes linked to atmospheric absorption are less consistently modeled than those linked to top-of-atmosphere radiative forcing. We identify a number of land regions where the model ensemble consistently predicts that fast precipitation responses to climate perturbations dominate over the slow, temperature-driven responses.
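The fast/slow decomposition described above is commonly written in a form like the following (a generic formulation rather than the exact PDRMIP regression), with the fast component scaling with the change in atmospheric absorption and the slow component scaling with the surface temperature change:

    \Delta P \approx \Delta P_{\mathrm{fast}}(\Delta A) \;+\; k\,\Delta T_{\mathrm{surf}}

where k is a hydrological sensitivity estimated separately for each model and forcing agent.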
Abstract:
The optimal formulation for the preparation of amaranth flour films plasticized with glycerol or sorbitol was obtained by multi-response analysis. The optimization aimed to achieve films with higher resistance to breaking, moderate elongation and lower solubility in water. The influence of plasticizer concentration (Cg, glycerol, or Cs, sorbitol) and process temperature (Tp) on the mechanical properties and solubility of the amaranth flour films was initially studied by response surface methodology (RSM). The optimized conditions obtained were Cg = 20.02 g glycerol/100 g flour with Tp = 75 °C, and Cs = 29.6 g sorbitol/100 g flour with Tp = 75 °C. Characterization of the films prepared with these formulations revealed that the optimization methodology employed in this work was satisfactory. Sorbitol was the more suitable plasticizer: it furnished amaranth flour films that were more resistant to breaking and less permeable to oxygen, owing to its greater miscibility with the biopolymers present in the flour and its lower affinity for water.
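For reference, the response surface methodology mentioned here typically fits a second-order polynomial in the process variables; for two factors, plasticizer concentration C and process temperature Tp, the generic model has the form (illustrative, not the fitted coefficients of this study):

    y = \beta_0 + \beta_1 C + \beta_2 T_p + \beta_{11} C^2 + \beta_{22} T_p^2 + \beta_{12} C\,T_p + \varepsilon

where y is a response such as tensile strength, elongation at break or water solubility, and the beta coefficients are estimated by least squares from the experimental design.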
Abstract:
The globular cluster HP 1 is projected on the bulge, very close to the Galactic center. The Multi-Conjugate Adaptive Optics Demonstrator on the Very Large Telescope allowed us to acquire high-resolution deep images that, combined with first-epoch New Technology Telescope data, enabled us to derive accurate proper motions. The stellar contents of the cluster and bulge fields were disentangled through this process, producing color-magnitude diagrams of this cluster with unprecedented definition. The metallicity of [Fe/H] ≈ -1.0 from previous spectroscopic analysis is confirmed, which, together with an extended blue horizontal branch, implies an age older than the halo average. Orbit reconstruction results suggest that HP 1 is spatially confined within the bulge.
Abstract:
Successful classification, information retrieval and image analysis tools are intimately related to the quality of the features employed in the process. Pixel intensities, color, texture and shape are, generally, the basis from which most features are computed and used in such fields. This paper presents a novel shape-based feature extraction approach in which an image is decomposed into multiple contours that are further characterized by Fourier descriptors. Unlike traditional approaches, we make use of topological knowledge to generate well-defined closed contours, which are efficient signatures for image retrieval. The method has been evaluated in the context of content-based image retrieval (CBIR) and image analysis. The results show that the multi-contour decomposition, as opposed to a single shape signature, introduces a significant improvement in discrimination power.
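A minimal sketch of computing Fourier descriptors for a single closed contour is given below (Python; a generic formulation for illustration only, which does not reproduce the paper's multi-contour decomposition or topological post-processing):

    # Generic Fourier-descriptor computation for a single closed contour.
    # The contour is a sequence of (x, y) boundary points; names are illustrative.
    import numpy as np

    def fourier_descriptors(contour_xy, n_coeffs=16):
        """Translation-, scale- and rotation-invariant descriptors of a closed contour."""
        z = contour_xy[:, 0] + 1j * contour_xy[:, 1]   # complex boundary signature
        coeffs = np.fft.fft(z)
        coeffs[0] = 0.0                                # drop DC term -> translation invariance
        mags = np.abs(coeffs)                          # discard phase -> rotation/start-point invariance
        if mags[1] > 0:
            mags = mags / mags[1]                      # normalise by first harmonic -> scale invariance
        return mags[1:n_coeffs + 1]

    # Example: descriptors of a circle sampled at 128 boundary points
    t = np.linspace(0, 2 * np.pi, 128, endpoint=False)
    circle = np.stack([np.cos(t), np.sin(t)], axis=1)
    print(fourier_descriptors(circle)[:5])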
Abstract:
The Internet of Things is an umbrella term for the development whereby different types of devices can be equipped with sensors and chips connected to the internet. An increased amount of data means an increased demand for solutions that can store, track, analyse and process data. One way to meet this demand is to use cloud-based real-time analytics services. Multi-tenant and single-tenant are two types of architectures for cloud-based real-time analytics services that can be used to address the problems of handling the increased data volumes. These architectures differ in the complexity of development. In this work, Azure Stream Analytics represents a multi-tenant architecture and HDInsight/Storm represents a single-tenant architecture. To compare cloud-based real-time analytics services with different architectures, we chose to use the usability criteria of efficiency, effectiveness and user satisfaction. We decided that we wanted answers to the following questions, related to the three usability criteria above: What similarities and differences can we see in development times? Can we identify differences in functionality? How do developers experience the two analytics services? We used a design-and-creation strategy to develop two proof-of-concept prototypes and collected data using several data collection methods. The proof-of-concept prototypes comprised two artefacts, one for Azure Stream Analytics and one for HDInsight/Storm. We evaluated them by carrying out five different scenarios, each with 2-5 sub-goals. We simulated streaming data by letting an application continuously generate random data, which we analysed with the two real-time analytics services. We used observations to document how we worked on developing the analytics services, as well as to measure development times and identify differences in functionality. We also used questionnaires to find out what users thought of the analytics services. We concluded that Azure Stream Analytics was initially more usable than HDInsight/Storm, but that the differences decreased over time. Azure Stream Analytics was easier to work with for simpler analyses, while HDInsight/Storm offered a broader range of functionality.
Abstract:
This paper reports the findings of using a multi-agent-based simulation model to evaluate the sawmill yard operations within a large privately owned sawmill in Sweden, Bergkvist-Insjön AB in the current case. Conventional working routines within the sawmill yard threaten overall efficiency and thereby limit the profit margin of the sawmill. Deploying dynamic work routines within the sawmill yard is not readily feasible in real time, so a discrete event simulation model has been investigated to report the optimal work order depending on the situation. Preliminary investigations indicate that the results achieved by the simulation model are promising. It is expected that the results achieved in the current case will support Bergkvist-Insjön AB in making optimal decisions by deploying an efficient work order in the sawmill yard.
Abstract:
Determining the provenance of data, i.e. the process that led to that data, is vital in many disciplines. For example, in science, the process that produced a given result must be demonstrably rigorous for the result to be deemed reliable. A provenance system supports applications in recording adequate documentation about process executions to answer queries regarding provenance, and provides functionality to perform those queries. Several provenance systems are being developed, but all focus on systems in which the components are reactive, for example Web Services that act on the basis of a request, job submission systems, etc. This limitation means that questions regarding the motives of autonomous actors, or agents, in such systems remain unanswerable in the general case. Such questions include: who was ultimately responsible for a given effect, what was their reason for initiating the process, and does the effect of a process match what those initiating it intended? In this paper, we address this limitation by integrating two solutions: a generic, re-usable framework for representing the provenance of data in service-oriented architectures, and a model for describing the goal-oriented delegation and engagement of agents in multi-agent systems. Using these solutions, we present algorithms to answer common questions regarding the responsibility for and success of a process, and evaluate the approach with a simulated healthcare example.
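As a toy illustration of the kind of responsibility query discussed (Python; the graph structure, relation names and healthcare-flavoured example are hypothetical and far simpler than the framework in the paper), one can walk causal and delegation edges recorded in a provenance graph back to the agent from whom the process ultimately originated:

    # Toy illustration of a responsibility query over a provenance graph.
    # Relation names, the graph and the example are hypothetical.
    provenance = {
        "prescription_issued": [("generatedBy", "consultation_process")],
        "consultation_process": [("performedBy", "doctor_agent")],
        "doctor_agent": [("actingOnDelegationFrom", "hospital_agent")],
        "hospital_agent": [],
    }

    def ultimately_responsible(node, graph):
        """Follow antecedent edges until an entity with no further antecedents is reached."""
        current = node
        visited = {current}
        while graph.get(current):
            _, antecedent = graph[current][0]  # this toy graph is a single chain
            if antecedent in visited:          # guard against cycles
                break
            visited.add(antecedent)
            current = antecedent
        return current

    print(ultimately_responsible("prescription_issued", provenance))  # -> hospital_agent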