75 results for Agent-based model
Abstract:
The aim of this three-year project funded by the Countryside Council for Wales (CCW) is to develop techniques, firstly, to refine and update existing targets for habitat restoration and re-creation at the landscape scale and, secondly, to develop a GIS-based model for the implementation of those targets at the local scale. Landscape Character Assessment (LCA) is being used to map Landscape Types across the whole of Wales as the first stage towards setting strategic habitat targets. The GIS habitat model uses data from the digital Phase I Habitat Survey for Wales to determine the suitability of individual sites for restoration to specific habitat types, including broadleaf woodland. The long-term aim is to develop a system that strengthens the character of Welsh landscapes and provides real biodiversity benefits based upon realistic targets, given limited resources for habitat restoration and re-creation.
Predictive vegetation mapping in the Mediterranean context: Considerations and methodological issues
Abstract:
The need to map vegetation communities over large areas for nature conservation, and to predict the impact of environmental change on vegetation distributions, has stimulated the development of techniques for predictive vegetation mapping. Predictive vegetation studies start with the development of a model relating vegetation units and mapped physical data, followed by the application of that model to a geographic database and over a wide range of spatial scales. This field is particularly important for identifying sites for rare and endangered species and locations of high biodiversity, such as many areas of the Mediterranean Basin. The potential of the approach is illustrated with a mapping exercise in the alti-mediterranean zone of Lefka Ori in Crete. The study established the nature of the relationship between vegetation communities and physical data including altitude, slope and geomorphology. In this way the knowledge of community distribution was improved, enabling a GIS-based model capable of predicting community distribution to be constructed. The paper describes the development of the spatial model and the methodological problems of predictive mapping for monitoring Mediterranean ecosystems. The paper concludes with a discussion of the role of predictive vegetation mapping and other spatial techniques, such as fuzzy mapping and geostatistics, for improving our understanding of the dynamics of Mediterranean ecosystems and for practical management in a region that is under increasing pressure from human impact.
Abstract:
This paper reports three experiments that examine the role of similarity processing in McGeorge and Burton's (1990) incidental learning task. In the experiments subjects performed a distractor task involving four-digit number strings, all of which conformed to a simple hidden rule. They were then given a forced-choice memory test in which they were presented with pairs of strings and were led to believe that one string of each pair had appeared in the prior learning phase. Although this was not the case, one string of each pair did conform to the hidden rule. Experiment 1 showed that, as in the McGeorge and Burton study, subjects were significantly more likely to select test strings that conformed to the hidden rule. However, additional analyses suggested that, rather than having implicitly abstracted the rule, subjects may have been selecting strings that were in some way similar to those seen during the learning phase. Experiments 2 and 3 were designed to separate out effects due to similarity from those due to implicit rule abstraction. It was found that the results were more consistent with a similarity-based model than with implicit rule abstraction per se.
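The contrast between the two accounts can be sketched in a few lines. The hidden rule and the similarity measure below are illustrative assumptions (the original stimuli and rule are not reproduced here); the point is only that a similarity-based chooser can pick rule-conforming strings without ever representing the rule.

```python
# Hypothetical hidden rule: the digit sum of the string is even.
def follows_rule(s):
    return sum(int(c) for c in s) % 2 == 0

# Crude similarity: best per-position digit overlap with any studied string.
def similarity(s, studied):
    return max(sum(a == b for a, b in zip(s, t)) for t in studied)

# Similarity-based model: pick the test string closer to the studied items.
def choose(pair, studied):
    return max(pair, key=lambda s: similarity(s, studied))

studied = ["2464", "8022", "6240"]   # all conform to the (hypothetical) rule
pair = ("2462", "1357")              # rule-conforming vs non-conforming
picked = choose(pair, studied)       # similarity alone selects "2462"
```

Because rule-conforming strings tend to resemble the (rule-conforming) study set, the similarity chooser mimics rule abstraction, which is exactly the confound the experiments were designed to separate.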
Abstract:
This paper focuses on successful reform strategies invoked in parts of the Muslim world to address issues of gender inequality in the context of Islamic personal law. It traces the development of personal status laws in Tunisia and Morocco, exploring the models they offer in initiating equality-enhancing reforms in Bangladesh, where a secular and equality-based reform approach conflicts with Islamic-based conservatism. Recent landmark family law reforms in Morocco show the possibility of achieving ‘women-friendly’ reforms within an Islamic legal framework. Moreover, the Tunisian Personal Status Code, with its successive reforms, shows that a gender equality-based model of personal law can be successfully integrated into the Muslim way of life. This study examines the response of Muslim societies to equality-based reforms and differences in approach in initiating them. The paper maps these sometimes competing approaches, locating them within contemporary feminist debates related to gender equality in the East and West.
Abstract:
The emergence behaviour of weed species in relation to cultural and meteorological events was studied. Dissimilarities between populations in dormancy and germination ecology, between-year maturation conditions and seed quality and burial site climate all contribute to potentially unpredictable variability. Therefore, a weed emergence data set was produced for weed seeds of Stellaria media and Chenopodium album matured and collected from three populations (Italy, Sweden and UK). The seeds were collected in two consecutive seasons (1999 and 2000) and subsequently buried in the autumn of the same year of maturation in eight contrasting climatic locations throughout Europe and the USA. The experiment sought to explore and explain differences between the three populations in their emergence behaviour. Evidence was demonstrated of synchrony in the timing of the emergence of different populations of a species at a given burial site. The relative magnitudes of emergence from the three populations at a given burial site in a given year were generally similar across all the burial sites in the study. The resulting data set was also used to construct a simple weed emergence model, which was tested for its application to the range of different burial environments and populations. The study demonstrated the possibility of using a simple thermal time-based model to describe part of the emergence behaviour across different burial sites, seed populations and seasons, and a simple winter chilling relationship to adjust for the magnitude of the flush of emergence at a given burial site. This study demonstrates the possibility of developing robust generic models for simple predictions of emergence timing across populations.
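A thermal time model of the kind described accumulates degree-days above a base temperature and maps the cumulative total to a fraction emerged. The sketch below is a minimal illustration under assumed parameter values (base temperature, logistic midpoint and slope are placeholders, not the study's fitted values).

```python
import math

def thermal_time(daily_temps, t_base=3.0):
    """Cumulative thermal time: degree-days accumulated above t_base."""
    total, series = 0.0, []
    for t in daily_temps:
        total += max(0.0, t - t_base)
        series.append(total)
    return series

def emergence_fraction(tt, tt50=120.0, slope=0.05):
    """Logistic emergence curve: fraction emerged at cumulative thermal time tt.

    tt50 is the thermal time at which half the flush has emerged."""
    return 1.0 / (1.0 + math.exp(-slope * (tt - tt50)))

# Three days of mean temperatures (degrees C) accumulate as degree-days:
tt = thermal_time([10.0, 12.0, 8.0])        # [7.0, 16.0, 21.0]
half = emergence_fraction(120.0)            # 0.5 at the midpoint by construction
```

A winter-chilling adjustment of the sort mentioned in the abstract would then scale the magnitude of the flush per site, leaving the timing to the thermal time curve.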
Abstract:
The main objectives of this paper are, firstly, to identify key issues related to sustainable intelligent buildings (environmental, social, economic and technological factors) and develop a conceptual model for the selection of appropriate KPIs; secondly, to critically test stakeholders' perceptions and values of selected intelligent-building KPIs; and thirdly, to develop a new model for measuring the level of sustainability of sustainable intelligent buildings. This paper uses a consensus-based model (Sustainable Built Environment Tool, SuBETool), which is analysed using the analytic hierarchy process (AHP) for multi-criteria decision-making. The use of the multi-attribute model for priority setting in the sustainability assessment of intelligent buildings is introduced. The paper commences by reviewing the literature on sustainable intelligent buildings research and presents a pilot study investigating the problems of complexity and subjectivity. This study is based upon a survey of perceptions held by selected stakeholders and the value they attribute to selected KPIs. It is argued that the benefit of the newly proposed model (SuBETool) is as a ‘tool’ for ‘comparative’ rather than absolute measurement. It has the potential to provide useful lessons from current sustainability assessment methods for the strategic future of sustainable intelligent buildings, in order to improve a building's performance and to deliver objective outcomes. The findings of this survey enrich the field of intelligent buildings in two ways. Firstly, they give a detailed insight into the selection of sustainable building indicators, as well as their degree of importance. Secondly, they critically test stakeholders' perceptions and values of selected intelligent-building KPIs. It is concluded that the priority levels for the selected criteria are largely dependent on the integrated design team, which includes the client, architects, engineers and facilities managers.
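The AHP step mentioned above turns stakeholders' pairwise comparisons into priority weights. A minimal sketch, using the common row geometric-mean approximation to the principal eigenvector; the three KPIs and the comparison values are hypothetical, not taken from the SuBETool survey.

```python
import math

def ahp_priorities(matrix):
    """Approximate AHP priority vector via the row geometric-mean method."""
    n = len(matrix)
    gm = [math.prod(row) ** (1.0 / n) for row in matrix]   # row geometric means
    s = sum(gm)
    return [g / s for g in gm]                             # normalise to sum to 1

# Hypothetical pairwise comparisons of three KPIs on Saaty's 1-9 scale:
# rows/columns = environmental, social, economic.
comparisons = [
    [1,     3,   5],
    [1 / 3, 1,   2],
    [1 / 5, 1 / 2, 1],
]
weights = ahp_priorities(comparisons)   # environmental ranks highest here
```

In a full AHP a consistency ratio would also be computed to check that the judgements are not self-contradictory before the weights are used.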
Abstract:
This report addresses the extent that managerial practices can be shared between the aerospace and construction sectors. Current recipes for learning from other industries tend to be oversimplistic and often fail to recognise the embedded and contextual nature of managerial knowledge. Knowledge sharing between business sectors is best understood as an essential source of innovation. The process of comparison challenges assumptions and better equips managers to cope with future change. Comparisons between the aerospace and construction sectors are especially useful because they are so different. The two sectors differ hugely in terms of their institutional context, structure and technological intensity. The aerospace sector has experienced extensive consolidation and is dominated by a small number of global companies. Aerospace companies operate within complex networks of global interdependency such that collaborative working is a commercial imperative. In contrast, the construction sector remains highly fragmented and is characterised by a continued reliance on small firms. The vast majority of construction firms compete within localised markets that are too often characterised by opportunistic behaviour. Comparing construction to aerospace highlights the unique characteristics of both sectors and helps explain how managerial practices are mediated by context. Detailed comparisons between the two sectors are made in a range of areas and guidance is provided for the implementation of knowledge sharing strategies within and across organisations. The commonly accepted notion of ‘best practice’ is exposed as a myth. Indeed, universal models of best practice can be detrimental to performance by deflecting from the need to adapt continuously to changing circumstances. Competitiveness in the construction sector too often rests on efficiency in managing contracts, with a particular emphasis on the allocation of risk. 
Innovation in construction tends to be problem-driven and is rarely shared from project to project. In aerospace, the dominant model of competitiveness means that firms have little choice other than to invest in continuous innovation, despite difficult trading conditions. Research and development (R&D) expenditure in aerospace continues to rise as a percentage of turnover. A sustained capacity for innovation within the aerospace sector depends crucially upon stability and continuity of work. In the construction sector, the emergence of the ‘hollowed-out’ firm has undermined the industry’s capacity for innovation. Integrated procurement contexts such as prime contracting in construction potentially provide a more supportive climate for an innovation-based model of competitiveness. However, investment in new ways of working depends upon a shift in thinking not only amongst construction contractors, but also amongst the industry’s major clients.
Abstract:
Space applications demand reliable systems. Autonomic computing defines such reliable systems as self-managing systems. The work reported in this paper combines agent-based and swarm robotic approaches, leading to swarm-array computing, a novel technique to achieve self-managing distributed parallel computing systems. Two swarm-array computing approaches, based on swarms of computational resources and swarms of tasks, are explored. An FPGA is considered as the computing system. The feasibility of the two proposed approaches, which bind the computing system and the task together, is simulated on the SeSAm multi-agent simulator.
Abstract:
The deployment of Quality of Service (QoS) techniques involves careful analysis of areas including business requirements, corporate strategy and the technical implementation process, which can lead to conflict or contradiction between the goals of the various user groups involved in policy definition. In addition, long-term change management presents a challenge, as these implementations typically require a high skill set and experience level, exposing organisations to effects such as “hyperthymestria” [1] and “The Seven Sins of Memory”, defined by Schacter and discussed further within this paper. It is proposed that, given the information embedded within the packets of IP traffic, an opportunity exists to augment traffic management with a machine-learning, agent-based mechanism. This paper describes the process by which current policies are defined and the research required to support the development of an application that enables adaptive intelligent Quality of Service controls to augment or replace the policy-based mechanisms currently in use.
Abstract:
Urban surveillance footage can be of poor quality, partly due to the low quality of the camera and partly due to harsh lighting and heavily reflective scenes. For some computer surveillance tasks very simple change detection is adequate, but sometimes a more detailed change detection mask is desirable, e.g., for accurately tracking identity when faced with multiple interacting individuals and in pose-based behaviour recognition. We present a novel technique for enhancing a low-quality change detection into a better segmentation using an image combing estimator in an MRF-based model.
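The general idea of refining a noisy change mask with an MRF can be illustrated with an Ising-style prior optimised by iterated conditional modes (ICM). This is a hedged sketch of the standard technique, not the paper's image combing estimator; the weights are illustrative.

```python
def icm_refine(mask, beta=1.0, data_weight=1.0, iters=5):
    """Refine a binary change mask with a 4-neighbour MRF via ICM.

    Each pixel takes the label minimising a data term (disagreement with
    the observed mask) plus a smoothness term (disagreement with neighbours)."""
    h, w = len(mask), len(mask[0])
    labels = [row[:] for row in mask]
    for _ in range(iters):
        for y in range(h):
            for x in range(w):
                best, best_e = labels[y][x], float("inf")
                for cand in (0, 1):
                    e = data_weight * (cand != mask[y][x])
                    for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w:
                            e += beta * (cand != labels[ny][nx])
                    if e < best_e:
                        best, best_e = cand, e
                labels[y][x] = best
    return labels

noisy = [
    [1, 1, 1, 0],
    [1, 0, 1, 0],   # isolated hole inside the changed region
    [1, 1, 1, 0],
]
clean = icm_refine(noisy)   # the hole is filled; the background is preserved
```

The smoothness weight `beta` trades mask cleanliness against detail: too high and thin structures (limbs, in the pose-recognition setting) are smoothed away.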
Abstract:
Space applications demand reliable systems. Autonomic computing defines such reliable systems as self-managing systems. The work reported in this paper combines agent-based and swarm robotic approaches, leading to swarm-array computing, a novel technique to achieve autonomy for distributed parallel computing systems. Two swarm-array computing approaches, based on swarms of computational resources and swarms of tasks, are explored. An FPGA is considered as the computing system. The feasibility of the two proposed approaches, which bind the computing system and the task together, is simulated on the SeSAm multi-agent simulator.
Modelling sediment supply and transport in the River Lugg: strategies for controlling sediment loads
Abstract:
The River Lugg has particular problems with high sediment loads that have resulted in detrimental impacts on ecology and fisheries. A new dynamic, process-based model of hydrology and sediments (INCA-SED) has been developed and applied to the River Lugg system using an extensive data set from 1995–2008. The model simulates sediment sources and sinks throughout the catchment and gives a good representation of the sediment response at 22 reaches along the River Lugg. A key question considered in using the model is the management of sediment sources so that concentrations and bed loads can be reduced in the river system. Altogether, five sediment management scenarios were selected for testing on the River Lugg, including land use change, contour tillage, hedging and buffer strips. Running the model with parameters altered to simulate these five scenarios produced some interesting results. All scenarios achieved some reduction in sediment levels, with the 40% land use change achieving the best result with a 19% reduction. The other scenarios also achieved significant reductions of between 7% and 9%, with buffer strips producing the best of these at close to 9%. The results suggest that if hedge introduction, contour tillage and buffer strips were all applied, sediment reductions would total 24%, considerably improving the current sediment situation. We present a novel cost-effectiveness analysis of our results where we use percentage of land removed from production as our cost function. Given the minimal loss of land associated with contour tillage, hedges and buffer strips, we suggest that these management practices are the most cost-effective combination to reduce sediment loads.
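The cost-effectiveness analysis described above reduces to a simple ratio: sediment reduction achieved per unit of land removed from production. The sketch below uses the reduction percentages from the abstract, but the land-cost percentages are illustrative assumptions, not values from the study.

```python
# Scenario -> modelled sediment reduction (%) and assumed land removed (%).
scenarios = {
    "40% land use change": {"reduction_pct": 19, "land_cost_pct": 40.0},
    "contour tillage":     {"reduction_pct": 7,  "land_cost_pct": 0.0},
    "hedges":              {"reduction_pct": 8,  "land_cost_pct": 0.5},
    "buffer strips":       {"reduction_pct": 9,  "land_cost_pct": 1.0},
}

def cost_effectiveness(s):
    """Sediment reduction per percent of land lost (infinite when no land is lost)."""
    land = s["land_cost_pct"]
    return float("inf") if land == 0 else s["reduction_pct"] / land

ranked = sorted(scenarios, key=lambda k: cost_effectiveness(scenarios[k]),
                reverse=True)
# Under these assumed costs, the low-land-cost measures dominate wholesale
# land use change, matching the abstract's conclusion.
```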
Abstract:
Brand competition is modelled using an agent-based approach in order to examine the long-run dynamics of market structure and brand characteristics. A repeated game is designed in which myopic firms choose strategies based on beliefs about their rivals and consumers. Consumers are heterogeneous and can observe neighbours' behaviour through social networks. Although firms do not observe them, the social networks have a significant impact on the emerging market structure. The presence of networks tends to polarize market share and leads to higher volatility in brands. Yet convergence in brand characteristics usually occurs whenever the market reaches a steady state. Scale-free networks accentuate the polarization and volatility more than small-world or random networks. Unilateral innovations are less frequent under social networks.
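The consumer side of such a model can be sketched in a few lines: agents weigh a private brand preference against the last choices of their network neighbours. This toy version uses a ring network and assumed weights; it only illustrates the mechanism by which unobserved social ties reshape market share, not the paper's actual game.

```python
import random

def simulate(n_consumers=50, steps=30, social=True, seed=1):
    """Toy agent-based market: consumers pick brand A (1) or B (0) each step,
    mixing a fixed private preference with the mean of two neighbours' choices."""
    rng = random.Random(seed)
    pref = [rng.random() for _ in range(n_consumers)]    # leaning toward brand A
    choice = [1 if p > 0.5 else 0 for p in pref]
    for _ in range(steps):
        nxt = []
        for i in range(n_consumers):
            if social:
                neigh = (choice[i - 1] + choice[(i + 1) % n_consumers]) / 2
                score = 0.5 * pref[i] + 0.5 * neigh       # assumed 50/50 weighting
            else:
                score = pref[i]
            nxt.append(1 if score > 0.5 else 0)
        choice = nxt
    return sum(choice) / n_consumers                      # brand A's market share

share_social = simulate(social=True)
share_isolated = simulate(social=False)
```

Replacing the ring with a scale-free or small-world graph is the natural next step for reproducing the network-structure comparisons in the abstract.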
Abstract:
The work reported in this paper is motivated by the need to handle single-node failures for parallel summation algorithms in computer clusters. An agent-based approach is proposed in which a task to be executed is decomposed into sub-tasks and mapped onto agents that traverse computing nodes. The agents intercommunicate across computing nodes to share information in the event of a predicted node failure. Two single-node failure scenarios are considered. The Message Passing Interface is employed for implementing the proposed approach. Quantitative results obtained from experiments reveal that the agent-based approach can handle failures more efficiently than traditional failure-handling approaches.
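The decompose-map-migrate idea can be illustrated without a cluster. The paper's implementation uses MPI across real nodes; the sketch below simulates nodes in one process and shows the invariant that matters: after an agent carries its sub-task off a node predicted to fail, the summation still completes correctly.

```python
class Node:
    """A simulated computing node holding agents (each agent owns a sub-task)."""
    def __init__(self, name):
        self.name = name
        self.healthy = True
        self.agents = []          # each entry is one agent's slice of the data

def decompose(data, n_parts):
    """Split the data into roughly equal sub-tasks, one per agent."""
    k, r = divmod(len(data), n_parts)
    parts, start = [], 0
    for i in range(n_parts):
        end = start + k + (1 if i < r else 0)
        parts.append(data[start:end])
        start = end
    return parts

def run(data, nodes, failing=None):
    """Map sub-tasks onto nodes; migrate agents off a predicted failure."""
    for node, part in zip(nodes, decompose(data, len(nodes))):
        node.agents.append(part)
    if failing is not None:
        failing.healthy = False
        target = next(n for n in nodes if n.healthy)
        target.agents.extend(failing.agents)   # agents move with their sub-tasks
        failing.agents = []
    return sum(sum(part) for n in nodes if n.healthy for part in n.agents)

nodes = [Node("n0"), Node("n1"), Node("n2")]
total = run(list(range(10)), nodes, failing=nodes[1])   # 0+1+...+9 = 45 despite failure
```

In the MPI setting the migration step becomes a send/receive of the agent's state to a healthy rank, triggered by the failure-prediction signal.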