813 results for feature based modelling
Abstract:
Considerable effort is presently being devoted to producing high-resolution sea surface temperature (SST) analyses with a goal of spatial grid resolutions as low as 1 km. Because grid resolution is not the same as feature resolution, a method is needed to objectively determine the resolution capability and accuracy of SST analysis products. Ocean model SST fields are used in this study as simulated “true” SST data and subsampled based on actual infrared and microwave satellite data coverage. The subsampled data are used to simulate sampling errors due to missing data. Two different SST analyses are considered and run using both the full and the subsampled model SST fields, with and without additional noise. The results are compared as a function of spatial scales of variability using wavenumber auto- and cross-spectral analysis. The spectral variance at high wavenumbers (smallest wavelengths) is shown to be attenuated relative to the true SST because of smoothing that is inherent to both analysis procedures. Comparisons of the two analyses (both having roughly the same grid size) show important differences. One analysis tends to reproduce small-scale features more accurately when the high-resolution data coverage is good but produces more spurious small-scale noise when the high-resolution data coverage is poor. Analysis procedures can thus generate small-scale features with and without data, but the small-scale features in an SST analysis may be just noise when high-resolution data are sparse. Users must therefore be skeptical of high-resolution SST products, especially in regions where high-resolution (~5 km) infrared satellite data are limited because of cloud cover.
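The core diagnostic here, wavenumber spectral analysis, can be sketched on a synthetic one-dimensional SST transect. This is a minimal illustration, not the paper's method: the "analysis" is mimicked by a simple 9-point circular smoother, and all field values are invented.

```python
import numpy as np

def wavenumber_spectrum(field, dx):
    """One-dimensional wavenumber power spectrum of a mean-removed transect."""
    anom = field - field.mean()
    power = np.abs(np.fft.rfft(anom)) ** 2 / len(field)
    k = np.fft.rfftfreq(len(field), d=dx)  # cycles per km
    return k, power

# Toy "true" SST transect: a large-scale wave plus small-scale noise.
rng = np.random.default_rng(0)
n, dx = 512, 1.0                           # 512 points at 1 km spacing
x = np.arange(n) * dx
true_sst = 20 + 2 * np.sin(2 * np.pi * x / 256) + 0.5 * rng.standard_normal(n)

# Mimic an analysis by circular 9-point smoothing: the attenuation
# inherent to analysis procedures described in the abstract.
analysed = sum(np.roll(true_sst, s) for s in range(-4, 5)) / 9

k, p_true = wavenumber_spectrum(true_sst, dx)
_, p_anal = wavenumber_spectrum(analysed, dx)

high = k > 0.1                             # wavelengths shorter than 10 km
print(p_anal[high].sum() / p_true[high].sum())  # far below 1: small scales attenuated
```

Comparing the ratio of analysed to true spectral variance band by band is exactly the kind of scale-dependent accuracy measure the abstract argues for.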
Abstract:
Question: What are the correlations between the degree of drought stress and temperature, and the adoption of specific adaptive strategies by plants in the Mediterranean region? Location: 602 sites across the Mediterranean region. Method: We considered 12 plant morphological and phenological traits, and measured their abundance at the sites as trait scores obtained from pollen percentages. We conducted stepwise regression analyses of trait scores as a function of plant available moisture (α) and winter temperature (MTCO). Results: Patterns in the abundance of the plant traits we considered are clearly determined by α, MTCO or a combination of both. In addition, trends in leaf size, texture, thickness, pubescence and aromatic leaves, and other plant-level traits such as thorniness and aphylly, vary according to the life form (tree, shrub, forb), the leaf type (broad, needle) and phenology (evergreen, summer-green). Conclusions: Despite conducting this study based on pollen data, we have identified ecologically plausible trends in the abundance of traits along climatic gradients. Plant traits other than the usual life form, leaf type and leaf phenology carry strong climatic signals. Generally, combinations of plant traits are more climatically diagnostic than individual traits. The qualitative and quantitative relationships between plant traits and climate parameters established here will help to provide an improved basis for modelling the impact of climate changes on vegetation and form a starting point for a global analysis of pollen-climate relationships.
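Stepwise regression of a trait score on α and MTCO can be sketched with a greedy forward-selection loop. This is a generic illustration on synthetic data, not the study's actual procedure or data; the variable names and the 0.01 stopping threshold are assumptions.

```python
import numpy as np

def forward_stepwise(y, candidates, min_gain=0.01):
    """Greedy forward selection: repeatedly add the predictor that most
    improves R^2, stopping when the gain falls below min_gain."""
    n = len(y)
    selected, X = [], np.ones((n, 1))      # start with intercept only
    tss = ((y - y.mean()) ** 2).sum()
    best_r2, improved = 0.0, True
    while improved:
        improved = False
        for name, col in candidates.items():
            if name in selected:
                continue
            Xc = np.column_stack([X, col])
            beta, *_ = np.linalg.lstsq(Xc, y, rcond=None)
            r2 = 1 - ((y - Xc @ beta) ** 2).sum() / tss
            if r2 - best_r2 > min_gain:
                best_r2, best_name, best_X = r2, name, Xc
                improved = True
        if improved:
            selected.append(best_name)
            X = best_X
    return selected, best_r2

# Synthetic trait scores driven by moisture (alpha) but not winter temperature.
rng = np.random.default_rng(1)
alpha = rng.uniform(0, 1, 200)             # plant-available moisture
mtco = rng.uniform(-10, 10, 200)           # mean temperature of the coldest month
score = 3 * alpha + 0.1 * rng.standard_normal(200)

selected, r2 = forward_stepwise(score, {"alpha": alpha, "MTCO": mtco})
print(selected)  # only the moisture predictor survives selection
```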
Abstract:
Aim This paper documents reconstructions of the vegetation patterns in Australia, Southeast Asia and the Pacific (SEAPAC region) in the mid-Holocene and at the last glacial maximum (LGM). Methods Vegetation patterns were reconstructed from pollen data using an objective biomization scheme based on plant functional types. The biomization scheme was first tested using 535 modern pollen samples from 377 sites, and then applied unchanged to fossil pollen samples dating to 6000 ± 500 or 18,000 ± 1000 14C yr bp. Results 1. Tests using surface pollen sample sites showed that the biomization scheme is capable of reproducing the modern broad-scale patterns of vegetation distribution. The north–south gradient in temperature, reflected in transitions from cool evergreen needleleaf forest in the extreme south through temperate rain forest or wet sclerophyll forest (WSFW) and into tropical forests, is well reconstructed. The transitions from xerophytic through sclerophyll woodlands and open forests to closed-canopy forests, which reflect the gradient in plant available moisture from the continental interior towards the coast, are reconstructed with less geographical precision but nevertheless the broad-scale pattern emerges. 2. Differences between the modern and mid-Holocene vegetation patterns in mainland Australia are comparatively small and reflect changes in moisture availability rather than temperature. In south-eastern Australia some sites show a shift towards more moisture-stressed vegetation in the mid-Holocene with xerophytic woods/scrub and temperate sclerophyll woodland and shrubland at sites characterized today by WSFW or warm-temperate rain forest (WTRF). However, sites in the Snowy Mountains, on the Southern Tablelands and east of the Great Dividing Range have more moisture-demanding vegetation in the mid-Holocene than today. South-western Australia was slightly drier than today. 
The single site in north-western Australia also shows conditions drier than today in the mid-Holocene. Changes in the tropics are also comparatively small, but the presence of WTRF and tropical deciduous broadleaf forest and woodland in the mid-Holocene, in sites occupied today by cool-temperate rain forest, indicate warmer conditions. 3. Expansion of xerophytic vegetation in the south and tropical deciduous broadleaf forest and woodland in the north indicate drier conditions across mainland Australia at the LGM. None of these changes are informative about the degree of cooling. However the evidence from the tropics, showing lowering of the treeline and forest belts, indicates that conditions were between 1 and 9 °C (depending on elevation) colder. The encroachment of tropical deciduous broadleaf forest and woodland into lowland evergreen broadleaf forest implies greater aridity. Main conclusions This study provides the first continental-scale reconstruction of mid-Holocene and LGM vegetation patterns from Australia, Southeast Asia and the Pacific (SEAPAC region) using an objective biomization scheme. These data will provide a benchmark for evaluation of palaeoclimate simulations within the framework of the Palaeoclimate Modelling Intercomparison Project.
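The biomization idea above, assigning each pollen sample to the biome with the highest affinity score computed from the plant functional types (PFTs) its taxa belong to, can be sketched as follows. The taxon-to-PFT and biome-to-PFT tables here are hypothetical, drastically simplified stand-ins for the scheme's real assignments, and the 0.5% threshold and square-root transform are common conventions assumed for illustration.

```python
import math

BIOME_PFTS = {  # hypothetical, simplified PFT lists per biome
    "tropical forest":  {"tropical broadleaf evergreen", "tropical broadleaf raingreen"},
    "temperate forest": {"temperate broadleaf evergreen", "temperate broadleaf summergreen"},
    "xerophytic scrub": {"xerophytic shrub", "forb"},
}

TAXON_PFTS = {  # which PFTs each pollen taxon can belong to (hypothetical)
    "Nothofagus": {"temperate broadleaf evergreen"},
    "Eucalyptus": {"temperate broadleaf summergreen", "xerophytic shrub"},
    "Poaceae":    {"forb"},
}

def biomize(pollen_percent, threshold=0.5):
    """Assign the sample to the biome with the highest affinity score,
    summing sqrt-transformed above-threshold percentages of matching taxa."""
    scores = {}
    for biome, pfts in BIOME_PFTS.items():
        score = 0.0
        for taxon, pct in pollen_percent.items():
            if pct > threshold and TAXON_PFTS.get(taxon, set()) & pfts:
                score += math.sqrt(pct - threshold)
        scores[biome] = score
    return max(scores, key=scores.get), scores

sample = {"Nothofagus": 40.0, "Eucalyptus": 5.0, "Poaceae": 10.0}
biome, scores = biomize(sample)
print(biome)  # temperate forest
```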
Abstract:
A key step in many numerical schemes for time-dependent partial differential equations with moving boundaries is to rescale the problem to a fixed numerical mesh. An alternative approach is to use a moving mesh that can be adapted to focus on specific features of the model. In this paper we present and discuss two different velocity-based moving mesh methods applied to a two-phase model of avascular tumour growth formulated by Breward et al. (2002) J. Math. Biol. 45(2), 125-152. Each method has one moving node which tracks the moving boundary. The first moving mesh method uses a mesh velocity proportional to the boundary velocity. The second moving mesh method uses local conservation of volume fraction of cells (masses). Our results demonstrate that these moving mesh methods produce accurate results, offering higher resolution where desired whilst preserving the balance of fluxes and sources in the governing equations.
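The first moving-mesh method, a mesh velocity proportional to the boundary velocity, can be sketched in a few lines: interior nodes move with a velocity that scales linearly from zero at the origin to the boundary velocity at the moving node. This is an illustrative one-dimensional toy, not the authors' full scheme for the tumour-growth equations.

```python
import numpy as np

def advance_mesh(x, boundary_velocity, dt):
    """Move each node with a velocity proportional to its position:
    the outermost node tracks the boundary exactly, the origin stays fixed."""
    L = x[-1]                          # current boundary position
    v = boundary_velocity * x / L      # linear velocity profile on [0, L]
    return x + dt * v

# Uniform initial mesh on [0, 1] with the boundary moving outward at speed 0.2.
x = np.linspace(0.0, 1.0, 11)
for _ in range(10):
    x = advance_mesh(x, boundary_velocity=0.2, dt=0.1)

print(x[-1])  # boundary has advanced from 1.0 to ~1.2 in ten steps
```

Because every node is rescaled by the same factor each step, the mesh stays uniform while the single moving node tracks the boundary, which is the property the method exploits.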
Abstract:
Salmonella enterica serovar Typhimurium is an established model organism for Gram-negative, intracellular pathogens. Owing to the rapid spread of resistance to antibiotics among this group of pathogens, new approaches to identify suitable target proteins are required. Based on the genome sequence of Salmonella Typhimurium and associated databases, a genome-scale metabolic model was constructed. Output was based on an experimental determination of the biomass of Salmonella when growing in glucose minimal medium. Linear programming was used to simulate variations in energy demand, while growing in glucose minimal medium. By grouping reactions with similar flux responses, a sub-network of 34 reactions responding to this variation was identified (the catabolic core). This network was used to identify sets of one and two reactions that, when removed from the genome-scale model, interfered with energy and biomass generation. Eleven such sets were found to be essential for the production of biomass precursors. Experimental investigation of seven of these showed that knock-outs of the associated genes resulted in attenuated growth for four pairs of reactions, while three single reactions were shown to be essential for growth.
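The paper's screen uses linear programming over the genome-scale model; as a much-reduced illustration of the single- and double-knockout idea, one can enumerate reaction removals on a toy network and call a knockout lethal when no route from glucose to a biomass precursor survives. The reaction names and topology below are hypothetical, chosen only to show why a pair can be lethal when neither member is essential alone.

```python
from itertools import combinations

# Toy reaction network (hypothetical): each reaction converts one metabolite
# into another; biomass is producible if a path exists from glucose.
REACTIONS = {
    "pts": ("glucose", "g6p"),
    "pgi": ("g6p", "f6p"),
    "zwf": ("g6p", "ru5p"),   # alternative route via the pentose phosphate pathway
    "rpe": ("ru5p", "f6p"),
    "pfk": ("f6p", "biomass_precursor"),
}

def produces_biomass(active):
    """Breadth-first reachability from glucose through the active reactions."""
    reachable, frontier = {"glucose"}, ["glucose"]
    while frontier:
        met = frontier.pop()
        for rxn in active:
            src, dst = REACTIONS[rxn]
            if src == met and dst not in reachable:
                reachable.add(dst)
                frontier.append(dst)
    return "biomass_precursor" in reachable

all_rxns = set(REACTIONS)
essential_singles = [r for r in all_rxns
                     if not produces_biomass(all_rxns - {r})]
lethal_pairs = [p for p in combinations(sorted(all_rxns), 2)
                if not set(p) & set(essential_singles)
                and not produces_biomass(all_rxns - set(p))]

print(sorted(essential_singles))  # ['pfk', 'pts']
print(lethal_pairs)               # [('pgi', 'rpe'), ('pgi', 'zwf')]
```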
Abstract:
Context: Variation in photosynthetic activity of trees induced by climatic stress can be effectively evaluated using remote sensing data. Although adverse effects of climate on temperate forests have been subjected to increased scrutiny, the suitability of remote sensing imagery for identification of drought stress in such forests has not been explored fully. Aim: To evaluate the sensitivity of MODIS-based vegetation index to heat and drought stress in temperate forests, and explore the differences in stress response of oaks and beech. Methods: We identified 8 oak and 13 beech pure and mature stands, each covering between 4 and 13 MODIS pixels. For each pixel, we extracted a time series of MODIS NDVI from 2000 to 2010. We identified all sequences of continuous unseasonal NDVI decline to be used as the response variable indicative of environmental stress. Neural Networks-based regression modelling was then applied to identify the climatic variables that best explain observed NDVI declines. Results: Tested variables explained 84–97% of the variation in NDVI, whilst air temperature-related climate extremes were found to be the most influential. Beech showed a linear response to the most influential climatic predictors, while oak responded in a unimodal pattern suggesting a better coping mechanism. Conclusions: MODIS NDVI has proved sufficiently sensitive as a stand-level indicator of climatic stress acting upon temperate broadleaf forests, leading to its potential use in predicting drought stress from meteorological observations and improving parameterisation of forest stress indices.
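The response variable described in Methods, sequences of continuous NDVI decline, amounts to run detection in a time series. A minimal sketch (the minimum run length of two steps is an assumption, not the study's value):

```python
def decline_sequences(ndvi, min_len=2):
    """Return (start_index, n_points, total_drop) for every run of
    consecutive NDVI decreases spanning at least min_len points."""
    runs, start = [], None
    for i in range(1, len(ndvi)):
        if ndvi[i] < ndvi[i - 1]:
            if start is None:
                start = i - 1
        else:
            if start is not None and i - start >= min_len:
                runs.append((start, i - start, ndvi[start] - ndvi[i - 1]))
            start = None
    if start is not None and len(ndvi) - start >= min_len:
        runs.append((start, len(ndvi) - start, ndvi[start] - ndvi[-1]))
    return runs

# Toy 16-day composite NDVI values with two decline episodes.
ndvi = [0.80, 0.82, 0.78, 0.74, 0.71, 0.75, 0.76, 0.70]
print(decline_sequences(ndvi))  # two runs: one of four points, one of two
```

In the study these runs would additionally be screened against the seasonal cycle so that ordinary autumn senescence is not counted as stress.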
Abstract:
The Maritime Continent archipelago, situated on the equator at 95-165E, has the strongest land-based precipitation on Earth. The latent heat release associated with the rainfall affects the atmospheric circulation throughout the tropics and into the extra-tropics. The greatest source of variability in precipitation is the diurnal cycle. The archipelago is within the convective region of the Madden-Julian Oscillation (MJO), which provides the greatest variability on intra-seasonal time scales: large-scale (∼10^7 km^2) active and suppressed convective envelopes propagate slowly (∼5 m s^-1) eastwards between the Indian and Pacific Oceans. High-resolution satellite data show that a strong diurnal cycle is triggered to the east of the advancing MJO envelope, leading the active MJO by one-eighth of an MJO cycle (∼6 days). Where the diurnal cycle is strong its modulation accounts for 81% of the variability in MJO precipitation. Over land this determines the structure of the diagnosed MJO. This is consistent with the equatorial wave dynamics in existing theories of MJO propagation. The MJO also affects the speed of gravity waves propagating offshore from the Maritime Continent islands. This is largely consistent with changes in static stability during the MJO cycle. The MJO and its interaction with the diurnal cycle are investigated in HiGEM, a high-resolution coupled model. Unlike many models, HiGEM represents the MJO well with eastward-propagating variability on intra-seasonal time scales at the correct zonal wavenumber, although the inter-tropical convergence zone's precipitation peaks strongly at the wrong time, interrupting the MJO's spatial structure. However, the modelled diurnal cycle is too weak and its phase is too early over land. The modulation of the diurnal amplitude by the MJO is also too weak and accounts for only 51% of the variability in MJO precipitation. 
Implications for forecasting and possible causes of the model errors are discussed, and further modelling studies are proposed.
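The diagnosed quantity behind statements like "modulation accounts for 81% of the variability", a day-by-day diurnal-cycle amplitude, can be sketched with a 24-hour harmonic fit. This is a generic diagnostic on synthetic data, not the thesis' method: the 45-day envelope standing in for the MJO and all values are invented.

```python
import numpy as np

def diurnal_amplitude(hourly):
    """Amplitude of the 24-hour harmonic, one value per day."""
    days = np.asarray(hourly).reshape(-1, 24)
    t = 2 * np.pi * np.arange(24) / 24
    a = 2 * (days * np.cos(t)).mean(axis=1)
    b = 2 * (days * np.sin(t)).mean(axis=1)
    return np.hypot(a, b)

# Synthetic rainfall-like series: a diurnal harmonic peaking at 14 LT whose
# amplitude is modulated by a slowly varying, MJO-like 45-day envelope.
n_days = 90
envelope = 1.0 + 0.5 * np.sin(2 * np.pi * np.arange(n_days) / 45)
hours = np.arange(24)
hourly = envelope[:, None] * np.cos(2 * np.pi * (hours - 14) / 24)

amp = diurnal_amplitude(hourly)
print(np.allclose(amp, envelope))  # True: the fit recovers the modulation
```

Comparing the variance of such an amplitude series against total intra-seasonal precipitation variance is one way to quantify how much of the MJO signal the diurnal cycle carries.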
Abstract:
Recent studies showed that features extracted from brain MRIs can discriminate well between Alzheimer’s disease and Mild Cognitive Impairment. This study provides an algorithm that sequentially applies advanced feature selection methods to find the best subset of features in terms of binary classification accuracy. The classifiers that provided the highest accuracies were then used to solve a multi-class problem with the one-versus-one strategy. Although several approaches based on Regions of Interest (ROIs) extraction exist, the predictive power of features has not yet been investigated by comparing filter and wrapper techniques. The findings of this work suggest that (i) IntraCranial Volume (ICV) normalization can lead to overfitting and worsen prediction accuracy on the test set and (ii) the combined use of a Random Forest-based filter with a Support Vector Machines-based wrapper improves binary classification accuracy.
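The filter-then-wrapper pattern can be sketched without the specific Random Forest and SVM components: below, the filter ranks features by a t-like class-separation score and the wrapper greedily grows the subset using a nearest-centroid classifier's accuracy. All components and data are simplified stand-ins for illustration, not the study's pipeline.

```python
import numpy as np

def filter_rank(X, y):
    """Filter step: rank features by absolute class-mean separation
    scaled by the overall standard deviation (a t-like score)."""
    m0, m1 = X[y == 0].mean(0), X[y == 1].mean(0)
    return np.argsort(-np.abs(m0 - m1) / (X.std(0) + 1e-12))

def centroid_accuracy(X, y, features):
    """Wrapper criterion: nearest-centroid accuracy on the given features."""
    Xf = X[:, features]
    c0, c1 = Xf[y == 0].mean(0), Xf[y == 1].mean(0)
    pred = (np.linalg.norm(Xf - c1, axis=1) <
            np.linalg.norm(Xf - c0, axis=1)).astype(int)
    return (pred == y).mean()

def filter_then_wrap(X, y, n_keep=5):
    """Keep the n_keep best filter-ranked features, then greedily add
    whichever of them most improves wrapper accuracy."""
    ranked = filter_rank(X, y)[:n_keep]
    chosen, best, improved = [], 0.0, True
    while improved:
        improved = False
        for f in ranked:
            if f in chosen:
                continue
            acc = centroid_accuracy(X, y, chosen + [f])
            if acc > best:
                best, pick, improved = acc, f, True
        if improved:
            chosen.append(pick)
    return chosen, best

# Synthetic data: only features 0 and 1 carry class signal.
rng = np.random.default_rng(2)
y = np.repeat([0, 1], 100)
X = rng.standard_normal((200, 20))
X[y == 1, 0] += 2.0
X[y == 1, 1] += 1.5

chosen, acc = filter_then_wrap(X, y)
print(sorted(chosen), round(acc, 2))  # informative features dominate the selection
```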
Abstract:
Understanding how and why one configuration of business resources, with its structural arrangements and mechanisms, performs better than another can provide competitive advantage in terms of new business processes and product and service development. However, most business models of capability are descriptive and lack a formal modelling language with which to compare capabilities qualitatively and quantitatively. Gibson’s theory of affordance, the potential for action, provides a formal basis for a more robust and quantitative model, but most formal affordance models are complex and abstract and lack support for real-world applications. We aim to understand the ‘how’ and ‘why’ of business capability by developing a quantitative and qualitative model that underpins earlier work on Capability-Affordance Modelling (CAM). This paper integrates an affordance-based capability model with the formalism of Coloured Petri Nets to develop a simulation model. Using the model, we show how capability depends on the space-time path of interacting resources, the mechanism of transition and specific critical affordance factors relating to the values of the variables for resources, people and physical objects. We show how the model can identify the resource capabilities needed to inject a drug and anaesthetise a patient.
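The Coloured-Petri-Net formalism behind the simulation can be sketched as a token game: places hold typed (coloured) tokens, and a transition fires only when its input places hold tokens of the required colours, so a capability corresponds to a firable sequence of transitions. This is an illustrative toy in plain Python, not CPN Tools notation or the paper's model; the drug-injection places and colours are invented to echo the worked example.

```python
from collections import Counter

places = {
    "syringe_ready": Counter({"syringe": 1}),
    "drug_drawn":    Counter(),
    "patient_site":  Counter({"patient": 1}),
    "anaesthetised": Counter(),
}

# Each transition maps {input place: required colour} to {output place: colour}.
transitions = {
    "draw_drug": ({"syringe_ready": "syringe"},
                  {"drug_drawn": "syringe+drug"}),
    "inject":    ({"drug_drawn": "syringe+drug", "patient_site": "patient"},
                  {"anaesthetised": "patient"}),
}

def enabled(name):
    inputs, _ = transitions[name]
    return all(places[p][colour] > 0 for p, colour in inputs.items())

def fire(name):
    if not enabled(name):
        raise RuntimeError(f"{name} is not enabled")
    inputs, outputs = transitions[name]
    for p, colour in inputs.items():
        places[p][colour] -= 1
    for p, colour in outputs.items():
        places[p][colour] += 1

fire("draw_drug")
fire("inject")
print(places["anaesthetised"]["patient"])  # 1: the capability was realised
```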
Abstract:
The problem of technology obsolescence in information-intensive businesses (software and hardware no longer being supported and replaced by improved and different solutions), combined with a cost-constrained market, can severely increase costs and operational, and ultimately reputational, risk. Although many businesses recognise technological obsolescence, the pervasive nature of technology often means they have little information with which to identify the risk and location of pending obsolescence, and little money to apply to the solution. This paper presents a low-cost structured method to identify obsolete software and the risk of its obsolescence: the structure of a business and its supporting IT resources are captured, modelled and analysed, and the risk to the business of technology obsolescence is identified so that remedial action can be taken using qualified obsolescence information. The technique is based on a structured modelling approach using enterprise architecture models and a heatmap algorithm to highlight high-risk obsolescent elements. The method has been tested and applied in practice in two consulting studies carried out by Capgemini involving three UK police forces. The generic technique could, however, be applied to any industry, and there are plans to improve it using ontology framework methods. This paper contains details of enterprise architecture meta-models and related modelling.
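A heatmap algorithm of the kind described typically scores each architecture element and bands the scores into colours. The scoring rule, the 1-5 scales, the band thresholds and the application names below are all hypothetical, sketched only to show the shape of the idea.

```python
def rag_band(likelihood, impact):
    """Band an obsolescence risk score (likelihood x impact, each 1-5)
    into red/amber/green for a heatmap overlay."""
    score = likelihood * impact
    if score >= 15:
        return "red"
    if score >= 8:
        return "amber"
    return "green"

# Hypothetical applications: (name, obsolescence likelihood, business impact).
applications = [
    ("crime_reporting_ui", 5, 4),   # unsupported platform, business-critical
    ("hr_portal",          3, 3),
    ("intranet_wiki",      2, 1),
]

heatmap = {name: rag_band(l, i) for name, l, i in applications}
print(heatmap)
# {'crime_reporting_ui': 'red', 'hr_portal': 'amber', 'intranet_wiki': 'green'}
```

Overlaying these bands onto the enterprise architecture model is what lets the method point remedial effort at the highest-risk elements first.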
Abstract:
Facility management (FM), from a service-oriented approach, addresses the functions and requirements of different services such as energy management, space planning and security. Each service requires different information to meet its needs. Object-based Building Information Modelling (BIM) offers limited support for FM services, even though this technology can generate 3D models that semantically represent a facility’s information dynamically over the lifecycle of a building. This paper presents a semiotics-inspired framework to extend BIM from a service-oriented perspective. The extended BIM, which specifies FM services and the information they require, will be able to express building service information in the right format for the right purposes. The service-oriented approach concerns the pragmatic aspect of building information, beyond the semantic level. Pragmatics defines and provides the context for the utilisation of building information. Semiotic theory is adopted in this paper to address pragmatic issues in the utilisation of BIM for FM services.
Abstract:
We consider the forecasting of macroeconomic variables that are subject to revisions, using Bayesian vintage-based vector autoregressions. The prior incorporates the belief that, after the first few data releases, subsequent ones are likely to consist of revisions that are largely unpredictable. The Bayesian approach allows the joint modelling of the data revisions of more than one variable, while keeping the concomitant increase in parameter estimation uncertainty manageable. Our model provides markedly more accurate forecasts of post-revision values of inflation than do other models in the literature.
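The key ingredient, a prior encoding the belief that later revisions are largely unpredictable, can be sketched with a conjugate Gaussian prior centred at zero on the coefficients that predict revisions: the posterior mean then shrinks the OLS estimate towards unpredictability. This is a one-regressor toy, not the paper's vintage-based VAR; all data are synthetic.

```python
import numpy as np

def posterior_mean(X, y, prior_mean, prior_prec, noise_var=1.0):
    """Posterior mean of regression coefficients under a Gaussian prior
    N(prior_mean, prior_prec^-1) and Gaussian observation noise."""
    A = X.T @ X / noise_var + prior_prec
    b = X.T @ y / noise_var + prior_prec @ prior_mean
    return np.linalg.solve(A, b)

# Revisions regressed on the first-release value; a prior mean of zero
# encodes the belief that revisions are largely unpredictable.
rng = np.random.default_rng(3)
first_release = rng.standard_normal(40)
revision = 0.05 * first_release + 0.1 * rng.standard_normal(40)

X = first_release[:, None]
tight = posterior_mean(X, revision, np.zeros(1), np.eye(1) * 100.0)
loose = posterior_mean(X, revision, np.zeros(1), np.eye(1) * 0.01)

print(abs(tight[0]) < abs(loose[0]))  # True: the tight prior shrinks towards 0
```

Keeping the prior precision high is also what makes the joint, multi-variable case tractable: it caps the parameter-estimation uncertainty that the abstract mentions.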
Abstract:
Performance modelling is a useful tool in the lifecycle of high-performance scientific software, such as weather and climate models, especially as a means of ensuring efficient use of available computing resources. In particular, sufficiently accurate performance prediction could reduce the effort and experimental computer time required when porting and optimising a climate model to a new machine. In this paper, traditional techniques are used to predict the computation time of a simple shallow water model which is illustrative of the computation (and communication) involved in climate models. These predictions are compared with real execution data gathered on AMD Opteron-based systems, including several phases of the U.K. academic community HPC resource, HECToR. Some success is achieved in relating source code to achieved performance for the K10 series of Opterons, but the method is found to be inadequate for the next-generation Interlagos processor. This experience leads to the investigation of a data-driven application benchmarking approach to performance modelling. Results for an early version of the approach are presented using the shallow water model as an example.
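A traditional analytic prediction of the kind tried first can be sketched as a roofline-style bound: predicted time per timestep is the larger of compute time and memory-traffic time. Every number below (flop and byte counts per grid point, peak rate, bandwidth) is an assumed toy value, not a measurement from the paper.

```python
def predict_step_time(flops, bytes_moved, peak_flops, bandwidth):
    """Roofline-style bound: a kernel is limited by either compute or
    memory traffic, whichever takes longer."""
    return max(flops / peak_flops, bytes_moved / bandwidth)

# Toy numbers for a shallow-water-like stencil kernel on one core (assumed).
grid_points = 512 * 512
flops_per_point = 30
bytes_per_point = 9 * 8            # nine 8-byte loads/stores per point

t = predict_step_time(
    flops=grid_points * flops_per_point,
    bytes_moved=grid_points * bytes_per_point,
    peak_flops=8e9,                # 8 GFLOP/s per core (assumed)
    bandwidth=10e9,                # 10 GB/s sustained (assumed)
)
print(f"{t * 1e3:.2f} ms per timestep")  # memory-bound here: ~1.89 ms
```

The data-driven benchmarking approach the paper moves to replaces these hand-counted inputs with coefficients fitted to measured kernel timings, which is why it copes better when the hardware model (e.g. Interlagos) defeats hand analysis.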
Abstract:
Population modelling is increasingly recognised as a useful tool for pesticide risk assessment. For vertebrates that may ingest pesticides with their food, such as woodpigeon (Columba palumbus), population models that simulate foraging behaviour explicitly can help predict both exposure and population-level impact. Optimal foraging theory is often assumed to explain the individual-level decisions driving distributions of individuals in the field, but it may not adequately predict spatial and temporal characteristics of woodpigeon foraging because of the woodpigeons’ excellent memory, ability to fly long distances, and distinctive flocking behaviour. Here we present an individual-based model (IBM) of the woodpigeon. We used the model to predict distributions of foraging woodpigeons that use one of six alternative foraging strategies: optimal foraging, memory-based foraging and random foraging, each with or without flocking mechanisms. We used pattern-oriented modelling to determine which of the foraging strategies is best able to reproduce observed data patterns. Data used for model evaluation were gathered during a long-term woodpigeon study conducted between 1961 and 2004 and a radiotracking study conducted in 2003 and 2004, both in the UK, and are summarised here as three complex patterns: the distributions of foraging birds between vegetation types during the year, the number of fields visited daily by individuals, and the proportion of fields revisited by them on subsequent days. The model with a memory-based foraging strategy and a flocking mechanism was the only one to reproduce these three data patterns, and the optimal foraging model produced poor matches to all of them. The random foraging strategy reproduced two of the three patterns but was not able to guarantee population persistence.
We conclude that with the memory-based foraging strategy including a flocking mechanism our model is realistic enough to estimate the potential exposure of woodpigeons to pesticides. We discuss how exposure can be linked to our model, and how the model could be used for risk assessment of pesticides, for example predicting exposure and effects in heterogeneous landscapes planted seasonally with a variety of crops, while accounting for differences in land use between landscapes.
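The winning strategy, memory-based foraging with flocking, can be caricatured in a few lines: each bird scores fields by its remembered intake plus an attraction to fields already holding flock-mates. This is an illustrative toy, not the published IBM; the memory-decay and flock-weight parameters and the field intakes are invented.

```python
import random

class Woodpigeon:
    def __init__(self, n_fields):
        self.memory = [1.0] * n_fields     # remembered intake per field

    def choose(self, occupancy, flock_weight=0.5):
        """Pick the field with the best memory-plus-flocking score."""
        scores = [m + flock_weight * occupancy[f]
                  for f, m in enumerate(self.memory)]
        return scores.index(max(scores))

    def update(self, field, intake, forgetting=0.8):
        """Exponentially weighted memory of intake at the visited field."""
        self.memory[field] = (forgetting * self.memory[field]
                              + (1 - forgetting) * intake)

random.seed(4)
food = [0.2, 1.0, 0.4]                     # true intake per field: field 1 is best
birds = [Woodpigeon(3) for _ in range(20)]
for day in range(30):
    occupancy = [0, 0, 0]
    for bird in birds:
        f = bird.choose(occupancy)
        occupancy[f] += 1
        bird.update(f, food[f] + random.gauss(0, 0.05))

print(occupancy)  # the flock ends up concentrated on a single good field
```

Even this caricature reproduces the qualitative behaviour the pattern-oriented comparison selects for: birds revisit remembered fields and aggregate into flocks rather than distributing themselves ideal-freely.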